I'm trying to run a job that moves data from a set of tables in an Oracle connection to a SQL Server one, but I'm getting the following exception, which causes the job to stop:
2017/04/04 11:00:56 - read from [DEMANDE].0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
2017/04/04 11:00:56 - read from [DEMANDE].0 - Couldn't get row from result set
2017/04/04 11:00:56 - read from [DEMANDE].0 -
2017/04/04 11:00:56 - read from [DEMANDE].0 - Unable to get value 'Integer(38)' from database resultset, index 3
2017/04/04 11:00:56 - read from [DEMANDE].0 - Overflow Exception
2017/04/04 11:00:56 - read from [DEMANDE].0 -
2017/04/04 11:00:56 - read from [DEMANDE].0 -
2017/04/04 11:00:56 - read from [DEMANDE].0 - at org.pentaho.di.core.database.Database.getRow(Database.java:2367)
2017/04/04 11:00:56 - read from [DEMANDE].0 - at org.pentaho.di.core.database.Database.getRow(Database.java:2337)
2017/04/04 11:00:56 - read from [DEMANDE].0 - at org.pentaho.di.trans.steps.tableinput.TableInput.processRow(TableInput.java:145)
2017/04/04 11:00:56 - read from [DEMANDE].0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2017/04/04 11:00:56 - read from [DEMANDE].0 - at java.lang.Thread.run(Thread.java:744)
2017/04/04 11:00:56 - read from [DEMANDE].0 - Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
2017/04/04 11:00:56 - read from [DEMANDE].0 - Unable to get value 'Integer(38)' from database resultset, index 3
2017/04/04 11:00:56 - read from [DEMANDE].0 - Overflow Exception
2017/04/04 11:00:56 - read from [DEMANDE].0 -
2017/04/04 11:00:56 - read from [DEMANDE].0 - at org.pentaho.di.core.row.value.ValueMetaBase.getValueFromResultSet(ValueMetaBase.java:4702)
2017/04/04 11:00:56 - read from [DEMANDE].0 - at org.pentaho.di.core.database.BaseDatabaseMeta.getValueFromResultSet(BaseDatabaseMeta.java:2091)
2017/04/04 11:00:56 - read from [DEMANDE].0 - at org.pentaho.di.core.database.DatabaseMeta.getValueFromResultSet(DatabaseMeta.java:2901)
2017/04/04 11:00:56 - read from [DEMANDE].0 - at org.pentaho.di.core.database.Database.getRow(Database.java:2359)
2017/04/04 11:00:56 - read from [DEMANDE].0 - ... 4 more
2017/04/04 11:00:56 - read from [DEMANDE].0 - Caused by: java.sql.SQLException: Overflow Exception
2017/04/04 11:00:56 - read from [DEMANDE].0 - at oracle.sql.NUMBER.toLong(NUMBER.java:371)
2017/04/04 11:00:56 - read from [DEMANDE].0 - at oracle.jdbc.dbaccess.DBConversion.NumberBytesToLong(DBConversion.java:2915)
2017/04/04 11:00:56 - read from [DEMANDE].0 - at oracle.jdbc.driver.OracleStatement.getLongValue(OracleStatement.java:4373)
2017/04/04 11:00:56 - read from [DEMANDE].0 - at oracle.jdbc.driver.OracleResultSetImpl.getLong(OracleResultSetImpl.java:529)
2017/04/04 11:00:56 - read from [DEMANDE].0 - at org.pentaho.di.core.row.value.ValueMetaBase.getValueFromResultSet(ValueMetaBase.java:4660)
The column the error index points to (index 3 in the trace) has the type NUMBER(38,0) in the Oracle schema; the corresponding type in the SQL Server table created by Pentaho is DECIMAL(38,0).
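If I read the trace correctly, Kettle treats the column as Integer(38) and fetches it with getLong(), and a NUMBER(38,0) can hold values far wider than a Java long (19 digits), which seems to be exactly when the driver's NUMBER.toLong() throws the Overflow Exception. A minimal sketch of that failure mode using plain BigDecimal (the 20-digit value is hypothetical):

    import java.math.BigDecimal;

    public class OverflowSketch {
        public static void main(String[] args) {
            // Hypothetical value: fits in Oracle NUMBER(38,0) but not in a Java long
            BigDecimal wide = new BigDecimal("99999999999999999999"); // 20 digits
            System.out.println("Long.MAX_VALUE = " + Long.MAX_VALUE); // only 19 digits
            try {
                // Fails with an overflow, analogous to oracle.sql.NUMBER.toLong()
                wide.longValueExact();
            } catch (ArithmeticException e) {
                System.out.println("Overflow: " + e);
            }
        }
    }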
What puzzles me is that the very same job runs smoothly for other connections.
I'm using classes12.jar as the JDBC driver. I've tried ojdbc6.jar (the one certified to work with JDK 6, 7 and 8), but I could not establish a connection to the database (Oracle 9i).
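(For reference, by "establish the connection" I mean a plain JDBC connection along these lines; the host, SID and credentials below are placeholders, not my real ones.)

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class OracleConnTest {
        public static void main(String[] args) throws Exception {
            // Explicit driver registration; this class exists in both classes12.jar and ojdbc6.jar
            Class.forName("oracle.jdbc.driver.OracleDriver");
            // Placeholder host/SID/credentials
            String url = "jdbc:oracle:thin:@dbhost:1521:ORCL";
            try (Connection con = DriverManager.getConnection(url, "user", "password")) {
                System.out.println("Connected to: " + con.getMetaData().getDatabaseProductVersion());
            }
        }
    }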
I've also tried upgrading Pentaho Kettle to the latest stable release (Pentaho 7.0), but it still gives the same error.
What could be causing this problem?