
I am trying to accomplish a simple task of:

  1. Read a text file that contains SQL queries separated by ";".
  2. Execute each query in a separate transaction.
  3. Be able to restart/resume the execution from the first query that failed.
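The first step can be sketched in plain Java (a hypothetical helper, not part of any framework; note that a naive split on ";" breaks if a literal semicolon appears inside a quoted string):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: split a SQL script into individual statements on ";".
public class SqlScriptSplitter {

    // Splits the script text on ";" and trims whitespace; empty fragments
    // (e.g. after the final semicolon) are dropped.
    public static List<String> split(String script) {
        List<String> statements = new ArrayList<>();
        for (String part : script.split(";")) {
            String trimmed = part.trim();
            if (!trimmed.isEmpty()) {
                statements.add(trimmed);
            }
        }
        return statements;
    }

    public static void main(String[] args) {
        String script = "INSERT INTO t VALUES (1);\nUPDATE t\nSET x = 2\nWHERE id = 1;\n";
        for (String sql : split(script)) {
            System.out.println(sql);
        }
    }
}
```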

I tried to do this with Spring Batch (version 2.1.8.RELEASE). The problems I am facing are as follows:

I failed to configure a simple out-of-the-box ItemReader that would read several lines up to ";" and aggregate them before passing the result to the ItemWriter.

I do not need a FieldSetMapper; I simply have no fields. My input is a text file containing SQL queries. Each query can span one or more lines, and the queries are separated by a semicolon. It seems impossible to define a LineMapper inside an ItemReader without specifying a FieldSetMapper that maps to a complex bean (I tried to set a BeanWrapperFieldSetMapper to map to java.lang.String but failed due to the exception described below).

Question 1: Why do I need a FieldSetMapper with prototypeBeanName set to a complex object if all I want is to append all the lines into a single String before passing it to the ItemWriter? When I configure the fieldSetMapper's prototypeBeanName property to refer to java.lang.String, I get an exception saying that the number of names is not equal to the number of values. I debugged the Spring Batch code and found that there are two values mapped to a single name: name=%SOME_NAME%, values={%MY_SQL_QUERY%, ";"}; the exception is thrown by AbstractLineTokenizer.tokenize() at line 123.

Question 2: Can the line aggregation be achieved using out-of-the-box Spring Batch ItemReaders? If yes, how do they need to be configured? Am I missing something in my configuration? (See below.)
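For Question 2, one out-of-the-box option (a sketch, assuming the custom separator policy already aggregates lines up to ";") is to skip tokenizing entirely and use Spring Batch's PassThroughLineMapper, which hands each aggregated record to the writer as a plain String, so no LineTokenizer or FieldSetMapper is needed:

```xml
<bean id="batchFileItemReader" class="org.springframework.batch.item.file.FlatFileItemReader">
    <property name="resource" value="classpath:/bat.sql.txt" />
    <property name="recordSeparatorPolicy" ref="separatorPolicy" />
    <!-- PassThroughLineMapper returns the whole record as a String -->
    <property name="lineMapper">
        <bean class="org.springframework.batch.item.file.mapping.PassThroughLineMapper" />
    </property>
</bean>
```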

configuration:

<bean id="basicParamsIncrementer" class="com.linking.core.exec.control.BasicJobParamsIncrementer" />

<batch:job id="rwBatchJob" restartable="true" incrementer="basicParamsIncrementer">
    <batch:step id="processBatchStep">
        <batch:tasklet transaction-manager="transactionManager" >
            <batch:chunk reader="batchFileItemReader" writer="sqlBatchFileWriter" 
                   commit-interval="1" />
        </batch:tasklet>
    </batch:step>
</batch:job>


<bean id="batchFileItemReader" class="org.springframework.batch.item.file.FlatFileItemReader">
    <property name="resource"
        value="classpath:/bat.sql.txt" />
    <property name="recordSeparatorPolicy" ref="separatorPolicy" />

    <property name="lineMapper">
        <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
            <property name="lineTokenizer" ref="sqlTokenizer" />
            <property name="fieldSetMapper" ref="sqlFieldSetMapper" />
        </bean>
    </property>
</bean>

<bean id="separatorPolicy" class="com.linking.core.SemiColonRecordSeparatorPolicy" />

<bean id="sqlTokenizer" class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
    <property name="delimiter" value=";" />
    <property name="names" value="sql,eol"/>
</bean>

<bean id="sqlFieldSetMapper" class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
    <property name="prototypeBeanName" value="sql" />
</bean>


<bean id="sql" class="com.linking.core.model.SQL" scope="prototype" /> 

com.linking.core.model.SQL is a simple class with two String members: sql (for the query) and eol (for the ";"). It is used to work around the fact that the reader returns both the lines up to the separator (as a String) and the separator itself (in my case ";") as values.
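The two-values-for-one-name problem suggests the ";" is being kept in the record and then treated as a second field by the DelimitedLineTokenizer. A RecordSeparatorPolicy can strip it instead; since com.linking.core.SemiColonRecordSeparatorPolicy is not shown in the question, the following is only a sketch of the logic such a policy (its isEndOfRecord/postProcess/preProcess methods) might implement, written here as plain static methods:

```java
// Sketch of the logic a semicolon-based RecordSeparatorPolicy might implement.
// The real com.linking.core.SemiColonRecordSeparatorPolicy is not shown in the
// question, so this is an assumption, not its actual code.
public class SemiColonPolicySketch {

    // A record is complete once the accumulated text ends with ";".
    public static boolean isEndOfRecord(String record) {
        return record.trim().endsWith(";");
    }

    // Strip the trailing ";" so the writer receives a clean statement
    // (and the tokenizer no longer sees a second, empty "field").
    public static String postProcess(String record) {
        String trimmed = record.trim();
        return trimmed.endsWith(";")
                ? trimmed.substring(0, trimmed.length() - 1).trim()
                : trimmed;
    }

    // Join continued lines with a space while the record is still open.
    public static String preProcess(String record) {
        return record + " ";
    }

    public static void main(String[] args) {
        System.out.println(isEndOfRecord("SELECT 1"));   // false
        System.out.println(isEndOfRecord("SELECT 1;"));  // true
        System.out.println(postProcess("SELECT 1;"));    // SELECT 1
    }
}
```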

Question 3: If the job that runs a bulk of SQL queries using a single-step reader-writer approach fails in the middle, how can I resume/restart the job from the first failed query? Should I do it programmatically by analyzing the last run, or can it be achieved through context configuration of the job step and the related out-of-the-box Spring Batch beans?


1 Answer


For Question 3, I would do a few things:

  • On the item reader, set the "saveState" property to true.
  • Set "commit-interval" to a suitable value.
  • Configure the Spring Batch database so that state can be saved.

If all of the above are set up, and you do not start a new job instance (e.g. via "next"), it will continue from where you left off!
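As a sketch (assuming a dataSource bean is available), those points might look like this in the configuration:

```xml
<!-- Persistent job repository so execution state survives restarts -->
<batch:job-repository id="jobRepository"
    data-source="dataSource"
    transaction-manager="transactionManager" />

<!-- saveState (true by default on FlatFileItemReader) records how far the
     reader got, so a restart resumes at the first uncommitted record -->
<bean id="batchFileItemReader" class="org.springframework.batch.item.file.FlatFileItemReader">
    <property name="saveState" value="true" />
    <!-- resource, recordSeparatorPolicy, lineMapper as in the question -->
</bean>
```

With commit-interval="1" as in the question's chunk configuration, each query is committed in its own transaction, so a restart picks up exactly at the first query that failed.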

Answered 2012-04-20T21:20:04.890