Articles & Content


ServerFault is a Q&A site for a community of system administrators and IT professionals

Note: The content of this blog is the opinion and thoughts of the blogger and does not necessarily represent the opinions of IDUG.


DB2 Integrity Checks and Exception Tables

I am working on planning a migration of a DB2 8.1 database from a horrible IBM encoding to UTF-8 to support additional languages. I am encountering an issue that I am stuck on.

A few notes on this migration:

  1. We are using db2move to export and load the data and db2look to get the details of the database (tablespaces, tables, keys, etc.).
  2. We found the loading process worked nicely with db2move import; however, the data takes 7 hours to load, and this is unacceptable downtime for when we actually convert the main database.
  3. We are now using db2move load, which is much faster, as it seems to simply throw the data in without integrity checks. This leads to my current issue.

After completing the db2move load process, several tables are in a check pending state and require integrity checks. Integrity checks are done via the following:

set integrity for <schema>.<table> immediate checked

This works for most tables, however, some tables give an error:

DB21034E The command was processed as an SQL statement because it was not a valid Command Line Processor command. During SQL processing it returned:

SQL3603N Check data processing through the SET INTEGRITY statement has found integrity violation involving a constraint with name "blah.SQL120124110232400". SQLSTATE=23514

The internets tell me that the solution to this issue is to create an exception table based on the actual table and tell the SET INTEGRITY command to send any exceptions to that table.


NOW, here is the specific issue I am having! The above forces all the rows with issues into the specified exception table. Well, that's just super, buuuuuut I cannot lose data in this conversion; it's simply unacceptable. The internets and IBM have only a vague description of sending the violations to the exception tables and then "dealing with the data" that ends up in the exception table. Unfortunately, I am not clear on what that means, and I was hoping that some wise individual knows and could help me out: how can I retrieve this data from these tables and place it back in the original/proper table rather than leaving it in these exception tables?

Let me know if you have any questions. Thanks!
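The exception-table pattern being asked about can be sketched as follows. This is a sketch only, with hypothetical names (mytable, mytable_exc, col1, col2); the exception-table layout (base columns plus an optional timestamp and message column) and the FOR EXCEPTION clause follow the DB2 SET INTEGRITY documentation.

```sql
-- 1. Create an exception table with the same columns as the base table,
--    plus two optional columns recording when and why each row was rejected.
CREATE TABLE mytable_exc LIKE mytable;
ALTER TABLE mytable_exc
  ADD COLUMN exc_ts  TIMESTAMP
  ADD COLUMN exc_msg CLOB(32K);

-- 2. Re-run the check; violating rows are moved into the exception table
--    instead of failing the statement.
SET INTEGRITY FOR mytable IMMEDIATE CHECKED
  FOR EXCEPTION IN mytable USE mytable_exc;

-- 3. "Dealing with the data" then means fixing the offending values inside
--    the exception table and inserting the rows back, selecting only the
--    base columns (not exc_ts / exc_msg):
INSERT INTO mytable (col1, col2)
  SELECT col1, col2 FROM mytable_exc;
```

No rows are lost this way: every violating row sits in the exception table, with exc_msg describing which constraint it broke, until it is repaired and copied back.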

Monitor SQL statements in DB2

I need to monitor SQL statements issued to a DB2 database. I found an article on this, and I can indeed capture SQL statements.

The problem is that prepared SQL statements still show question marks (parameter markers). Is there a way to get the final version of the SQL statements?

DB2 version: 10.1.3
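One avenue worth noting (a sketch only, and version- and licensing-dependent) is an activity event monitor with value collection, which records the input data values bound to each parameter marker alongside the statement text; the monitor name actmon here is hypothetical.

```sql
-- Sketch, assuming the WLM activity event monitor available in DB2 9.7+.
CREATE EVENT MONITOR actmon FOR ACTIVITIES WRITE TO TABLE;
ALTER WORKLOAD sysdefaultuserworkload
  COLLECT ACTIVITY DATA WITH DETAILS AND VALUES;
SET EVENT MONITOR actmon STATE 1;
-- The statement text is written to the ACTIVITYSTMT_* table and the values
-- bound to the parameter markers to the ACTIVITYVALS_* table; the two can be
-- joined on the activity identifier columns to reconstruct the full statement.
```

Value collection adds overhead, so it is usually enabled only while diagnosing.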


AS400 and Remote Commands

On the AS400 I want to remotely execute:

strpgrprg topgr(userx) message(ALERT!)

What options are available to me?

Update: I want to run an AS400 program from outside the AS400. I want to execute this command (or similar) from a Windows batch file or a Linux shell script.

I've found some info on how to do it via FTP. I just haven't tried it out yet and am still looking to see whether it's the best way to do it.
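The FTP route mentioned above relies on the IBM i FTP server's RCMD subcommand, which runs a CL command on the host. A sketch from a Linux shell, reusing the command string from the question; the hostname, user, and password are placeholders, and plain FTP sends them unencrypted.

```
# Sketch only: as400host / MYUSER / MYPASS are placeholders.
ftp -n as400host <<'EOF'
user MYUSER MYPASS
quote rcmd STRPGRPRG TOPGR(USERX) MESSAGE(ALERT!)
quit
EOF
```

The same subcommands work from a Windows batch file via `ftp -n -s:script.txt as400host`, with the lines placed in script.txt.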

ansible - Run "command db2 update database manager configuration using svcename db2c_db2inst1" as db2 user

I am setting up Ansible to install DB2 on a Linux server. Everything is working except that in the last step I need to run:

db2 update database manager configuration using svcename db2c_db2inst1

However, I cannot seem to run that as an unprivileged user (I can run it as the db2inst1 user from the command line and it works). The task I am using looks like this:

 tasks:
   - name: setup svcename db2c_db2inst1
     remote_user: db2inst1
     shell: db2 update database manager configuration using svcename db2c_db2inst1

but I get the following error:

TASK [setup svcename db2c_db2inst1] ********************************************
fatal: [db2ansible]: FAILED! => {"changed": true, "cmd": "db2 update database manager configuration using svcename db2c_db2inst1", "delta": "0:00:00.003631", "end": "2017-02-13 16:39:38.301753", "failed": true, "rc": 127, "start": "2017-02-13 16:39:38.298122", "stderr": "/bin/sh: 1: db2: not found", "stdout": "", "stdout_lines": [], "warnings": []}

Any suggestions?

Thank you.
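For context on the error above: rc 127 with "db2: not found" means the non-interactive shell Ansible spawns never sourced the instance environment, so db2 is not on its PATH (an interactive login as db2inst1 sources it automatically). A sketch of the task with the instance profile sourced first; the /home/db2inst1 path is an assumption matching the default instance home.

```
# Sketch: the db2profile path assumes the default instance home directory.
- name: setup svcename db2c_db2inst1
  become: true
  become_user: db2inst1
  shell: . /home/db2inst1/sqllib/db2profile && db2 update database manager configuration using svcename db2c_db2inst1
```

Sourcing db2profile inside the same shell invocation keeps the fix local to the task rather than depending on the remote user's login shell configuration.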


What is the "sysproc" for DB2?

I have to research the vulnerability "CVE-2016-8944". However, I do not understand what "sysproc" is. What is it, and when is it used?



Drop database on DB2 9.5 - SQL1035N The database is currently in use

I've never got this working the first time, but now I can't seem to do it at all.

There is a connection pool somewhere using the database, so trying to drop the database while an application is using it should give this error. The problem is that there are no connections to the database when I issue these commands:

db2 connect to mydatabase
db2 quiesce database immediate force connections
db2 connect reset
db2 drop database mydatabase

This always gives:

SQL1035N The database is currently in use. SQLSTATE=57019

Running this command shows no connections/applications:

DB2 list applications

I can even deactivate the database, but still can't drop it.

db2 => deactivate database mydatabase
DB20000I  The DEACTIVATE DATABASE command completed successfully.
db2 => drop database mydatabase
SQL1035N  The database is currently in use.  SQLSTATE=57019
db2 =>

Anyone got any clues? I'm running the command windows as the local administrator (Windows 2008), who is also the admin for DB2. The connection-pool user cannot connect during the quiesce state.
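One thing worth checking, as a sketch rather than a guaranteed fix: the CLP back-end process itself keeps an attachment to the database after CONNECT RESET, so issuing db2 terminate before the drop (or bouncing the instance) can clear the phantom connection.

```
db2 connect to mydatabase
db2 quiesce database immediate force connections
db2 connect reset
db2 terminate
db2 drop database mydatabase

rem If that still fails, restarting the instance forcibly releases everything:
db2stop force
db2start
db2 drop database mydatabase
```

db2 terminate ends only the CLP's own back-end process; db2stop force is more disruptive, as it terminates every connection on the instance.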

Confused about database names and remote locations on AS400

I'm tasked with building a web service that gets its data from an old AS400 database server.

I'm trying to connect from Node.js, and I'm getting this error on most databases:

SQL30061N The database alias or database name "database name here" was not found at the remote node. SQLSTATE=08004

Except for one, which states: "An attempt to connect to a host failed due to a missing DB2 Connect product"

(that's more or less expected)

I noticed that the last one is the *local database, and the rest have some other remote name. I also read that I can only connect to the *local database, but it is never explained why.

I'm really confused: all these databases are "local" in the sense that they are all hosted on the same physical machine. What does *local mean on a remote location, and why can't I connect to the other databases if they are not *local?

DB2 and Linux remote server replication

I'm using DB2 10.5 and SLES 11 SP4.

My question is: how would I replicate a remote server's database without having to SSH in and manually export/import?

DB2: How to retain backups if automatic backup is enabled

Is there a way to retain the backups, or to configure the interval at which they are deleted? According to the manuals it's not possible: "If backup to disk is selected, the automatic backup feature will regularly delete backup images from the directory specified in the automatic database backup configuration. Only the most recent backup image will be available at any given time, regardless of the number of full backups that are specified in the automatic backup policy file."

Sounds strange to me.

Export Data from IBM DB2 into an SQL-INSERT Script using IBM Data Studio Client or another tool

I have here a running IBM DB2 database. I would like to export data from some tables into an SQL INSERT script, for example for the table T1 with the following content:

---------------
| Col1 | Col2 |
---------------
| 1    | Foo  |
---------------
| 2    | Bar  |
---------------

A script like

INSERT INTO T1 (Col1, Col2) VALUES(1, 'Foo');
INSERT INTO T1 (Col1, Col2) VALUES(2, 'Bar');

should be generated. The tables I would like to export do not have any auto-generated columns, so no special logic to treat those separately is necessary.

I've been using IBM Data Studio Client to export DDL, examine the data, etc., but I did not find any export function that produces an SQL INSERT script (there are functions to export to CSV, etc.).

Can someone please give me a hint about a tool that could do this job, or tell me where in IBM Data Studio I can do this export?
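As a stopgap while looking for a tool, generating such a script yourself is straightforward. A minimal Python sketch: the rows_to_inserts helper is hypothetical (not a Data Studio or DB2 feature), and the sample rows below stand in for a live result set that a real run would fetch via a DB2 driver such as ibm_db.

```python
def rows_to_inserts(table, columns, rows):
    """Render rows as SQL INSERT statements (hypothetical helper)."""
    stmts = []
    for row in rows:
        # NULL for None, bare literals for numbers, quoted (with '' escaping)
        # for everything else.
        values = ", ".join(
            "NULL" if v is None
            else str(v) if isinstance(v, (int, float)) and not isinstance(v, bool)
            else "'" + str(v).replace("'", "''") + "'"
            for v in row
        )
        stmts.append(f"INSERT INTO {table} ({', '.join(columns)}) VALUES({values});")
    return stmts

# Sample data standing in for rows fetched from DB2.
for stmt in rows_to_inserts("T1", ["Col1", "Col2"], [(1, "Foo"), (2, "Bar")]):
    print(stmt)
# Prints:
#   INSERT INTO T1 (Col1, Col2) VALUES(1, 'Foo');
#   INSERT INTO T1 (Col1, Col2) VALUES(2, 'Bar');
```

This only covers plain character and numeric columns; LOB or binary columns would need extra handling.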