Articles & Content



Note: The content of this blog is the opinion and thoughts of the blogger and does not necessarily represent the opinions of IDUG.


How to find the cause of a shutdown

I have an HP ProLiant server running CentOS 6.9, with DB2 10.5.5 as the database. These errors were in the diag log:

  1. A fatal error occurred in data protection services
  2. An unexpected and critical error has occurred: "ForceDBShutdown"
  3. The database manager has shut down the following database because a severe error has occurred
  4. ForceDBShutdown : success
  5. Logging can not continue due to an error
  6. The database was damaged, so all applications processing the database were stopped

How can I find the cause of the database damage? The log just says that something happened.
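For a starting point, the entries quoted above are usually accompanied by much more detail in db2diag.log itself, and the db2diag tool can filter it; storage failures are a common root cause of this error pattern. A minimal sketch (flags assumed available at the 10.5 level; verify against your installation):

```sh
# Pull only the most serious records out of db2diag.log
# (db2diag ships with DB2):
db2diag -level Severe,Critical

# Check the OS logs for disk/controller errors at the same timestamp,
# since "data protection services" failures often trace back to storage:
grep -iE 'error|fault|scsi|i/o' /var/log/messages
```

Correlating the timestamps of the Severe records with anything in /var/log/messages is usually the quickest way to tell a hardware problem from a DB2-internal one.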

DB2 Integrity Checks and Exception Tables

I am planning a migration of a DB2 8.1 database from a horrible IBM encoding to UTF-8 to support more languages, etc. I am encountering an issue that I am stuck on.

A few notes on this migration:

  1. We are using db2move to export and load the data and db2look to get the details of the database (tablespaces, tables, keys, etc.).
  2. We found the loading process worked nicely with db2move import; however, the data takes 7 hours to load, which is unacceptable downtime for when we actually perform the conversion on the main database.
  3. We are now using db2move load, which is much faster, as it seems to simply throw the data in without integrity checks. That leads to my current issue.

After completing the db2move load process, several tables are in a check pending state and require integrity checks. Integrity checks are done via the following:

set integrity for . immediate checked

This works for most tables, however, some tables give an error:

DB21034E The command was processed as an SQL statement because it was not a valid Command Line Processor command. During SQL processing it returned:

SQL3603N Check data processing through the SET INTEGRITY statement has found integrity violation involving a constraint with name "blah.SQL120124110232400". SQLSTATE=23514

The internets tell me that the solution to this issue is to create an exception table based on the actual table and tell the SET INTEGRITY command to send any exceptions to that table (as below):


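The statement referred to above is presumably along these lines (schema and table names here are hypothetical; the exception table is typically created as a column-for-column copy of the original):

```sql
-- Exception table with the same columns as the original:
CREATE TABLE myschema.mytable_exc LIKE myschema.mytable;

-- Re-run the check, diverting violating rows to the exception table:
SET INTEGRITY FOR myschema.mytable IMMEDIATE CHECKED
    FOR EXCEPTION IN myschema.mytable USE myschema.mytable_exc;
```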
NOW, here is the specific issue I am having! The above forces all the rows with issues into the specified exception table. Well, that's just super, buuuuuut I cannot lose data in this conversion; it's simply unacceptable. The internets and IBM have only a vague description of sending the violations to the exception tables and then "dealing with the data" that is in the exception table. Unfortunately, I am not clear on what this means, and I was hoping that some wise individual could help me out and let me know how I can retrieve this data from these tables and place it back in the original/proper table rather than leaving it in these exception tables.
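For what it's worth, one reading of "dealing with the data": the exception table holds exact copies of the rejected rows, so nothing is lost; you correct the constraint violations there and move the rows back. A sketch with hypothetical names (SELECT * maps one-to-one only if the exception table was created with LIKE and no extra message columns):

```sql
-- Inspect the rejected rows to see why they violated the constraint:
SELECT * FROM myschema.mytable_exc;

-- Repair the offending values in place, e.g. (column and value hypothetical):
UPDATE myschema.mytable_exc SET some_col = 'valid value' WHERE some_col IS NULL;

-- Move the repaired rows back and clean up:
INSERT INTO myschema.mytable SELECT * FROM myschema.mytable_exc;
DELETE FROM myschema.mytable_exc;
```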

Let me know if you have any questions. Thanks!

ansible - Run the command "db2 update database manager configuration using svcename db2c_db2inst1" as the db2 user

I am setting up Ansible to install DB2 on a Linux server. Everything works except the last step, where I need to run:

db2 update database manager configuration using svcename db2c_db2inst1

However, I cannot seem to run that as an unprivileged user (I can run it as the db2inst1 user from the command line and it works). The task I am using looks like this:

 tasks:
   - name: setup svcename db2c_db2inst1
     remote_user: db2inst1
     shell: db2 update database manager configuration using svcename db2c_db2inst1

but I get the following error:

TASK [setup svcename db2c_db2inst1] ********************************************
fatal: [db2ansible]: FAILED! => {"changed": true, "cmd": "db2 update database manager configuration using svcename db2c_db2inst1", "delta": "0:00:00.003631", "end": "2017-02-13 16:39:38.301753", "failed": true, "rc": 127, "start": "2017-02-13 16:39:38.298122", "stderr": "/bin/sh: 1: db2: not found", "stdout": "", "stdout_lines": [], "warnings": []}

Any suggestions?
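The rc 127 / `db2: not found` error suggests the non-interactive shell Ansible spawns never sourced the instance profile that puts db2 on the PATH. One common workaround is to source db2profile explicitly in the task; a sketch (the instance home path is an assumption, and become/become_user is used here rather than remote_user to switch users on the target):

```yaml
tasks:
  - name: setup svcename db2c_db2inst1
    become: yes
    become_user: db2inst1
    shell: . /home/db2inst1/sqllib/db2profile && db2 update database manager configuration using svcename db2c_db2inst1
```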

Thank you.


JVM crashes during a call to RUNJVA from a CL program on an AS400 machine

I am calling a runnable JAR from a CL program using the RUNJVA command twice with different parameters:

RUNJVA     CLASS('/MYFOLDER/JAVA/project.jar') +
             PARM('INIT' '' 'TESTLIB') +
             OUTPUT(* *CONTINUE)

RUNJVA     CLASS('/MYFOLDER/JAVA/project.jar') +
             PARM('CLOSE' '' 'TESTLIB') +
             OUTPUT(* *CONTINUE)

The first call finishes successfully; the second call starts but terminates soon after without writing any exception to the log file. Note: the code is wrapped in a try-catch(Throwable) block.

Important point: the JVM crash occurs at the point where I create the DB2 connection:

connection = DriverManager.getConnection("jdbc:db2:*local;translate binary=true;prompt=false;naming=sql;libraries=TESTLIB");

or sometimes when creating the AS400 object, as in server = new

Any help will be appreciated.

DB2 10.5 HADR read only standby applications don't reconnect to primary


server SERV_A, database DBNAME primary

server SERV_B, database DBNAME standby with DB2_HADR_ROS enabled

Then this situation occurs:

  1. connection CON is made from client to DBNAME when primary is on SERV_A
  2. takeover DBNAME to SERV_B -> DBNAME becomes primary on SERV_B
  3. connection is rerouted with ACR (Automatic Client Reroute) to SERV_B
  4. takeover DBNAME back to SERV_A -> DBNAME becomes primary on SERV_A
  5. Connection CON does not go back to SERV_A but remains connected to SERV_B in read-only mode.

How can this situation be avoided? The active connection remains on the standby database in read-only mode until you restart the connection. It is even worse with apps that use connection pools (WebSphere Application Server), where you have to restart the entire application server to force the connection pool to connect to the primary server first.

This occurs with the IBM db2dsdriver with ACR configured, and with the type 4 JDBC driver. Tested on multiple versions (fix packs) of DB2 10.5 and 11.

How to track DB2 database changes from Linux without triggers or DB modification

I've got an assignment where I need to create a shell script that tracks just the changes to the employees table (in a DB2 database), i.e. any insert, delete or update that the HR department performs, on an hourly basis.

No need for SQL or bash code. Just ideas on how to get this done.

Cons: I cannot edit or alter any DB schema or add/create any trigger.

Pros: I have the credentials to select * from the table

Is there any way I can achieve this without pulling all DB records and comparing them?

I just only need the new changes (update, insert or delete).

PS: I DO have a successful DB connection and already performing select queries.
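For what it's worth, the classic trigger-free idea here is snapshot-and-diff: export the table on a schedule (the db2 export step needs only SELECT rights) and compare consecutive snapshots. The export step below is hypothetical and commented out; the diff logic itself uses only standard tools and assumes the primary key is the first comma-separated field and that snapshots are sorted:

```shell
#!/bin/sh
# Hypothetical export step (table/database names are examples):
#   db2 connect to SAMPLE
#   db2 "export to new.csv of del select * from hr.employees order by 1"

# Compare two sorted CSV snapshots; the key is assumed to be field 1.
detect_changes() {
    old=$1
    new=$2
    # Full lines present only in the new snapshot: inserts + new side of updates
    comm -13 "$old" "$new" > added_or_changed.txt
    # Full lines present only in the old snapshot: deletes + old side of updates
    comm -23 "$old" "$new" > removed_or_changed.txt
    # A key on both sides of the diff means that row was updated;
    # only-new means insert, only-old means delete.
    cut -d, -f1 removed_or_changed.txt | sort > old_keys.txt
    cut -d, -f1 added_or_changed.txt | sort > new_keys.txt
    comm -12 old_keys.txt new_keys.txt > updated_keys.txt
}
```

Run hourly from cron, rotating new.csv to old.csv after each pass; only the diff files need to be kept. This does pull a full copy each hour, but nothing short of triggers, CDC, or log reading avoids that given the stated constraints.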

Thanks for the time to look at the post.


Why does the database on a DB2 HADR standby server deactivate automatically?

I'm trying to configure SAP HADR with a primary and a standby server. The configuration seems to be OK, but the database on the standby server deactivates automatically. When I run db2 activate database on the standby server, it activates successfully, but after a few seconds the database gets deactivated again. Because of this, NEARSYNC log shipping is not consistent. Please advise what the possible cause of the deactivation could be.

SAP ECC6.0 DB6 9.1

Unable to install php_ibm_db2.dll on PHP 5.6

I'm getting the following error when trying to start Apache. I've confirmed the extension is in the folder and other extensions such as sqlsrv work just fine. This is the only one that seems to fail on load. I'm running 32-bit PHP and Apache.

extension download

[17-Oct-2017 16:45:44 UTC] PHP Warning: PHP Startup: Unable to load dynamic library 'c:/wamp/bin/php/php5.6.25/ext/php_ibm_db2_nts.dll' - The specified module could not be found. in Unknown on line 0
[17-Oct-2017 16:45:44 UTC] PHP Warning: PHP Startup: Unable to load dynamic library 'c:/wamp/bin/php/php5.6.25/ext/php_ibm_db2_ts.dll' - The specified module could not be found. in Unknown on line 0

Export data from IBM DB2 into an SQL INSERT script using IBM Data Studio Client or another tool

I have a running IBM DB2 database here. I would like to export data from some tables into an SQL INSERT script, for example for the table T1 with the following content:

---------------
| Col1 | Col2 |
---------------
|  1   | Foo  |
|  2   | Bar  |
---------------

A script like

INSERT INTO T1 (Col1, Col2) VALUES(1, 'Foo');
INSERT INTO T1 (Col1, Col2) VALUES(2, 'Bar');

should be generated. The tables I would like to export do not have any auto-generated columns, so no special logic to treat those separately is necessary.

I've been using IBM Data Studio Client to export the DDL, examine the data, etc., but I did not find any function to export into an SQL INSERT script (there are functions to export to CSV, etc.).

Can someone please give me some hints about a tool that could do this job, or tell me where in IBM Data Studio I could do this export?
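In case it helps, one workaround that needs no special tool is to let the database generate the script itself: a query that concatenates each row into an INSERT statement. A sketch for the example table above (quoting is handled naively here; adjust casts and escaping for your real column types):

```sql
SELECT 'INSERT INTO T1 (Col1, Col2) VALUES(' ||
       VARCHAR(Col1) || ', ''' ||
       REPLACE(Col2, '''', '''''') || ''');'
FROM T1
```

Running this through the CLP with db2 -x (which suppresses column headers) and redirecting stdout to a file yields the script directly.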

DB2 and Linux remote server replication

I'm using DB2 10.5 and SLES 11 SP4.

My question is: how would I replicate a remote server's DB without having to ssh in and manually export/import?
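DB2 has purpose-built options worth evaluating first (HADR, SQL replication), but the lowest-effort approach is a scheduled backup/pull/restore cycle with no interactive session, assuming key-based authentication between the hosts. A sketch with all names, paths, and the image filename pattern hypothetical:

```sh
# On the remote server (e.g. from a cron job there):
db2 backup db MYDB to /backup compress

# On the local server, pull the latest backup image over
# (exact image filename pattern varies by version/instance):
scp remoteserver:/backup/MYDB.0.db2inst1.* /local/backup/

# Restore locally, replacing the previous copy:
db2 restore db MYDB from /local/backup replace existing
```

Note the restored copy is only as fresh as the last backup; if you need continuous synchronization rather than periodic refreshes, HADR or SQL replication is the better fit.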