Articles & Content


ServerFault is a Q&A site for a community of system administrators and IT professionals.

Note: The content of this blog is the opinion and thoughts of the blogger and does not necessarily represent the opinions of IDUG.


How to connect to DB2 when the password ends with '!' in Windows

I am having a problem using the DB2 tools with a generic account whose generated password ends with a bang sign ('!') when connecting to a DB2 database. I am not allowed to change the password because it is already used by other processes.

I know the user is valid and I can connect to the database with its credentials, but not from all db2 tools.

When using the Control Center it is okay. When using the Command Editor (GUI) or the Command Window, I get this error message:

connect to WAREHOUS user administrator using !
SQL0104N An unexpected token "!" was found following "". Expected tokens may include: "NEW". SQLSTATE=42601

Let's say that my password is: pass@!

I am trying to use

c:\>db2 connect to sample user administrator using "pass@!"


c:\>db2 connect to sample user administrator using pass@!

In both cases I got the same error message.

I could change the way I connect, for example:

c:\>db2 connect to sample user administrator
Enter current password for administrator:

But I cannot use that easily from a batch file.

I would like to know how I can connect from the Command Editor, in order to use this user from the graphical tools.

BTW, I know that the Control Center is deprecated.
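For the batch-file side, one approach worth trying (a sketch only, not verified against every CLP version; the Command Editor question still stands) is to hand the whole CONNECT statement to the db2 executable as a single argument, with the password quoted inside the statement, so the shell never re-tokenizes the trailing '!':

```python
import subprocess  # only needed when you actually run the command

# a sketch for batch use, assuming the DB2 CLP ("db2") is on PATH.
# passing the whole statement as ONE argument (no shell) keeps cmd.exe
# from stripping the quotes around a password that ends in '!'.
def build_db2_connect(database, user, password):
    stmt = f'connect to {database} user {user} using "{password}"'
    return ["db2", stmt]

cmd = build_db2_connect("sample", "administrator", "pass@!")
# subprocess.run(cmd)  # uncomment on a machine with the DB2 CLP installed
```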

Example pyodbc db2 connection [closed]

I am trying to get a pyodbc connection (on a CentOS machine) to a DB2 database running on an IBM i.

Whatever connection string or configuration I add, I always get the same message:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
pyodbc.Error: ('IM002', '[IM002] [unixODBC][Driver Manager]Data source name not found, and no default driver specified (0) (SQLDriverConnect)')

Short of posting my configuration here, is there a working example available anywhere?
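IM002 means unixODBC could not match the connection string to a DSN in odbc.ini or to a DRIVER= entry in odbcinst.ini. One common shape is a DSN-less string naming the driver directly; a sketch below (the driver name "iSeries Access ODBC Driver" and the host are assumptions, and the name must match an entry in /etc/odbcinst.ini exactly):

```python
# build a DSN-less pyodbc connection string for the IBM i Access ODBC driver.
# the driver name and host here are assumptions; the DRIVER value must match
# a driver registered in /etc/odbcinst.ini exactly, braces included.
def ibmi_conn_str(system, user, password):
    return (
        "DRIVER={iSeries Access ODBC Driver};"
        f"SYSTEM={system};UID={user};PWD={password};"
    )

conn_str = ibmi_conn_str("192.0.2.10", "MYUSER", "secret")
# import pyodbc; conn = pyodbc.connect(conn_str)  # requires the driver installed
```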

ODBC Driver for IBM iSeries AS400 version V7R1

I am trying to connect to an IBM iSeries using pyodbc, a Python interface for ODBC, on Red Hat Linux (Intel 64-bit).

The following is what I have done:

  1. Installed gcc-c++ (needed to compile pyodbc), unixODBC, and unixODBC-devel.

  2. Installed iSeriesAccess-6.1.0-1.0.x86_64.rpm:

     sudo rpm -ivh --nodeps

  3. Attached the driver to unixODBC:

     odbcinst -i -d -f /opt/ibm/iSeriesAccess/unixodbcregistration64

  4. Since I had some issues with libraries, I symbolically linked them:

     sudo ln -s /opt/ibm/iSeriesAccess/lib64/libcwb* /usr/lib/

  5. Configured the ODBC data source for the AS400's database in /etc/odbc.ini:

     [AS400]
     Description = Database iSeries
     Driver = iSeries Access ODBC Driver
     System = XXX.XXX.XXX.XXX
     Port = 8471
     UserID = XXXXXX
     Password = XXXXXX
     Naming = 1
     Database = XXXXXXXXX
     ConnectionType = 0
     CommitMode = 2
     BlockFetch = 1
     BlockSizeKB = 512

  6. Tested the connection:

     > isql -v AS400
     +---------------------------------------+
     | Connected!                            |
     |                                       |
     | sql-statement                         |
     | help [tablename]                      |
     | quit                                  |
     |                                       |
     +---------------------------------------+
     SQL>
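As an extra sanity check on the odbc.ini entry (a sketch; in real use you would read /etc/odbc.ini rather than a string), the DSN can be parsed with Python's standard configparser before involving pyodbc at all:

```python
import configparser

# quick sanity check: confirm the [AS400] DSN exists and names a driver.
# odbc.ini uses INI syntax, so configparser can read it directly.
def dsn_driver(odbc_ini_text, dsn):
    cfg = configparser.ConfigParser()
    cfg.read_string(odbc_ini_text)
    if dsn not in cfg:
        raise KeyError(f"DSN [{dsn}] not found")
    return cfg[dsn].get("Driver")

# illustration with an inline snippet; use cfg.read("/etc/odbc.ini") in practice
example = "[AS400]\nDriver = iSeries Access ODBC Driver\nSystem = 192.0.2.10\n"
driver = dsn_driver(example, "AS400")
```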

I was able to connect, but when running any SQL query I get Segmentation fault (core dumped).

By checking /var/log/messages I see this:

isql[5767]: segfault at 7f6fdb30020e ip 00007f6ed372ff71 sp 00007ffcce17d5f0 error 4 in[7f6ed369e000+11c000]

I would like to re-install a different driver (e.g. iSeriesAccess-7.1.0-1.0x86_64.rpm), but I am not able to find it.

DB2 Integrity Checks and Exception Tables

I am working on planning a migration of a DB2 8.1 database from a horrible IBM encoding to UTF-8 to support further languages, etc. I have run into an issue that I am stuck on.

A few notes on this migration:

  1. We are using db2move to export and load the data and db2look to get the details of the database (tablespaces, tables, keys, etc.).
  2. We found the loading process worked nicely with db2move import; however, the data takes 7 hours to load, and that is unacceptable downtime for when we actually perform the conversion on the main database.
  3. We are now using db2move load, which is much faster as it seems to simply throw the data in without integrity checks, which leads to my current issue.

After completing the db2move load process, several tables are in a check pending state and require integrity checks. Integrity checks are done via the following:

set integrity for <schema>.<table> immediate checked

This works for most tables, however, some tables give an error:

DB21034E The command was processed as an SQL statement because it was not a valid Command Line Processor command. During SQL processing it returned:

SQL3603N Check data processing through the SET INTEGRITY statement has found integrity violation involving a constraint with name "blah.SQL120124110232400". SQLSTATE=23514

The internets tell me that the solution to this issue is to create an exception table based on the actual table and tell the SET INTEGRITY command to send any exceptions to that table, as below:

set integrity for <schema>.<table> immediate checked for exception in <schema>.<table> use <schema>.<exception table>
NOW, here is the specific issue I am having! The above forces all the rows with issues into the specified exception table. Well, that's just super, but I cannot lose data in this conversion; it is simply unacceptable. The internets and IBM have only a vague description of sending the violations to the exception tables and then "dealing with the data" that lands in them. Unfortunately, I am not clear on what this means, and I was hoping that some wise individual could tell me how to retrieve this data from the exception tables and place it back in the original/proper table rather than leaving it there.
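For illustration (table and column names here are hypothetical): an exception table mirrors the base table's columns plus extra message columns, so once the offending values have been corrected in the exception table, the rows can be moved back with a plain INSERT ... SELECT over the base columns and then deleted from the exception table. A small sketch that builds those two statements:

```python
# hypothetical helper: build the SQL to move corrected rows back from an
# exception table into the base table. We list the base columns explicitly
# so the exception table's extra message columns are left out of the copy.
def repair_sql(base_table, exception_table, base_columns):
    cols = ", ".join(base_columns)
    move_back = (f"INSERT INTO {base_table} ({cols}) "
                 f"SELECT {cols} FROM {exception_table}")
    cleanup = f"DELETE FROM {exception_table}"
    return move_back, cleanup

move_back, cleanup = repair_sql("BLAH.T1", "BLAH.T1_EXC", ["COL1", "COL2"])
```

After running the INSERT, re-running SET INTEGRITY confirms whether the corrected rows now pass the constraint.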

Let me know if you have any questions. Thanks!

Export Data from IBM DB2 into an SQL INSERT Script using IBM Data Studio Client or Another Tool

I have a running IBM DB2 database here. I would like to export data from some tables into an SQL INSERT script, for example for the table T1 with the following content:

---------------
| Col1 | Col2 |
---------------
| 1    | Foo  |
---------------
| 2    | Bar  |
---------------

A script like

INSERT INTO T1 (Col1, Col2) VALUES (1, 'Foo');
INSERT INTO T1 (Col1, Col2) VALUES (2, 'Bar');

should be generated. The tables I would like to export do not have any auto-generated columns, so no special logic to treat those separately is necessary.

I've been using IBM Data Studio Client to export DDL, examine the data, etc., but I did not find any function to export into an SQL INSERT script (there are functions to export into a CSV, etc.).

Can someone please give me some hints about a tool that could do this job, or tell me where in IBM Data Studio I could do this export?
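If no tool turns up, such a script is also easy to generate yourself. A minimal sketch (the row-fetching side is left out; rows could come from any DB2 driver such as ibm_db or pyodbc, and this naive literal quoting only covers numbers, strings, and NULL):

```python
# generate an SQL INSERT script from in-memory rows.
def insert_script(table, columns, rows):
    def lit(v):
        if v is None:
            return "NULL"
        if isinstance(v, (int, float)):
            return str(v)
        return "'" + str(v).replace("'", "''") + "'"  # double embedded quotes
    cols = ", ".join(columns)
    return "\n".join(
        f"INSERT INTO {table} ({cols}) VALUES ({', '.join(lit(v) for v in row)});"
        for row in rows
    )

script = insert_script("T1", ["Col1", "Col2"], [(1, "Foo"), (2, "Bar")])
print(script)
```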

Monitor SQL statements in DB2

I need to monitor SQL statements issued to a DB2 database. I found the following article, and I can indeed capture SQL statements.

The problem is that prepared SQL statements still contain question marks (parameter markers). Is there a way to get the final version of the SQL statements, with the actual values filled in?

DB2 version: 10.1.3
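If the monitoring source captures the parameter values separately from the statement text, the two can be spliced together for readable logs. A sketch (the splitting is naive and would be fooled by a '?' inside a string literal; this is for log readability only, never for execution):

```python
# hypothetical logging helper: splice captured parameter values into a
# statement text that still contains '?' parameter markers.
def fill_markers(stmt, params):
    parts = stmt.split("?")  # naive: assumes no '?' inside string literals
    if len(parts) - 1 != len(params):
        raise ValueError("marker/parameter count mismatch")
    out = parts[0]
    for part, p in zip(parts[1:], params):
        lit = "NULL" if p is None else (
            str(p) if isinstance(p, (int, float))
            else "'" + str(p).replace("'", "''") + "'")
        out += lit + part
    return out

result = fill_markers("SELECT * FROM T WHERE ID = ? AND NAME = ?", [7, "O'Brien"])
```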


Db2 backup only transaction log files

Is it possible to take a transaction log (archive log) backup in a Db2 database?

I know that an online backup takes an archive log backup by default, but what if I want an archive log backup a few hours after the online incremental backup?
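As far as I know, Db2 has no dedicated "backup logs only" command; with LOGARCHMETH1 set to a disk location, one common approach is simply to copy newly archived log files to backup storage on a schedule, independent of database backups. A sketch (directory layout and the S*.LOG naming are assumptions about a disk archive target):

```python
import pathlib
import shutil

# copy archived Db2 log files (S*.LOG) not yet present in backup storage.
# paths are assumptions; point archive_dir at your LOGARCHMETH1 location.
def copy_new_logs(archive_dir, backup_dir):
    src, dst = pathlib.Path(archive_dir), pathlib.Path(backup_dir)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in sorted(src.glob("S*.LOG")):
        target = dst / f.name
        if not target.exists():          # skip logs already backed up
            shutil.copy2(f, target)
            copied.append(f.name)
    return copied
```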

Is it possible to build a queue sharing group on only a coupling facility and not use db2?

I keep reading docs that say the messages are kept in a CF and can be offloaded to SMDS or Db2 if they are too large. This would suggest I could build out queues in a CF, offload large messages to disk, and avoid Db2 altogether. Then other docs say the queue objects are stored in Db2 and that Db2 is necessary. Is it possible to build a QSG without Db2?

DB2 10.5 HADR read only standby applications don't reconnect to primary


server SERV_A, database DBNAME primary

server SERV_B, database DBNAME standby with DB2_HADR_ROS enabled

Then this situation occurs:

  1. connection CON is made from client to DBNAME when primary is on SERV_A
  2. takeover DBNAME to SERV_B -> DBNAME becomes primary on SERV_B
  3. connection is rerouted with ACR (Automatic Client Reroute) to SERV_B
  4. takeover DBNAME back to SERV_A -> DBNAME becomes primary on SERV_A
  5. Connection CON does not go back to SERV_A but remains connected to SERV_B in read-only mode.

How can this situation be avoided? The active connection remains on the standby database in read-only mode until you restart the connection. It is even worse with apps that use connection pools (WebSphere Application Server), where you have to restart the entire application server to force the connection pool to reconnect to the primary server.

This occurs with the IBM db2 dsdriver with ACR configured, and with the JDBC driver type 4. Tested on multiple versions (fixpacks) of DB2 10.5 and 11.

Recovering from a DB2 Failure

My system had a full hardware failure last night, and I spent all day bringing it back up. Long story short, after a lot of "Why isn't XDB starting?", I ended up reviewing the logs and found that the transaction logs are not available/corrupted.

Every time the DB2 database tries to start, it comes up 'marked for shutdown' and never comes up.

My issue is that I believe the DB2 instance is in a dirty state and can't come back up. Other than restoring from a backup file, are there ways to recover from such an issue?