Thomas Rahier

When you have to extract data from an Oracle database, SSIS provides the Attunity Source component for the job. In one of my projects we have used this component for years, because it works reliably and really fast.

However, after a while we faced a new situation: we had to connect to an Oracle database server that is not located in the same network as our SQL Server. Once we were able to connect to the database (database user, firewall permissions etc.) we started working, and of course we used the Attunity Source component as always.

Big surprise: the performance of the data extraction was incredibly bad; some simple extracts took hours and hours. We had no explanation for this, so we tried to find the reason. We checked the network and the firewalls, we tuned the Oracle SQL queries, and so on. But we could not find a solution, so we accepted the low performance for the moment, because we could somehow live with it.

As the project progressed it turned out that the bad performance of the data extraction would make life more and more difficult for us. So we invested some more time and finally found the reason:
The problems were caused by the default setting of the "BatchSize" property of the Attunity Source component.
[Screenshot: SpeedUpAttunity01]

The default setting of the component is "100", which means that 100 records are loaded in one batch, then the next 100, and so on. As long as we worked in the same network this was no problem, but with the new database on the new server in the new network it turned out that a batch of only 100 records causes far too many network round trips to perform well.
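A small back-of-envelope model shows why the round trips hurt so much over a slower link. This is only an illustration, not a measurement: the latency and per-row transfer values below are assumed, and the real behavior depends on the driver, the network and the server.

```python
import math

def estimated_extract_time(rows, batch_size, round_trip_latency_s, per_row_transfer_s):
    """Rough model: every batch costs one network round trip plus the
    time to transfer its rows. Illustration only, not a benchmark."""
    round_trips = math.ceil(rows / batch_size)
    return round_trips * round_trip_latency_s + rows * per_row_transfer_s

# 4 million rows, 30 ms WAN round trip, 1 microsecond per row (assumed values)
slow = estimated_extract_time(4_000_000, 100, 0.030, 0.000001)      # 40,000 round trips
fast = estimated_extract_time(4_000_000, 100_000, 0.030, 0.000001)  # only 40 round trips
print(f"BatchSize 100:    ~{slow:.0f} s")
print(f"BatchSize 100000: ~{fast:.0f} s")
```

With these assumed numbers the latency term completely dominates the small-batch case, which matches what we saw: the same query that was fine in the local network became unusable over the WAN.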

So I experimented with the "BatchSize" to find out what could help us. I executed a Data Flow Task that extracted data from a table with over 100 columns and four million records.
[Screenshot: SpeedUpAttunity05]

It turned out that a higher setting improves performance, but you cannot set it as high as you want, probably because of limited memory, processor power etc. Eventually we decided to use 100000, because this provided sufficient performance for us and worked reliably.
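The memory side of that trade-off is easy to estimate: every batch has to be buffered, so the buffer grows roughly with batch size times row width. The column count below comes from the test table above; the average column width is an assumption purely for illustration.

```python
def batch_memory_mb(batch_size, columns, avg_bytes_per_column):
    """Very rough buffer estimate: rows per batch * estimated row width.
    The 20-byte average column width is an assumed value."""
    row_bytes = columns * avg_bytes_per_column
    return batch_size * row_bytes / (1024 * 1024)

for batch_size in (100, 10_000, 100_000, 1_000_000):
    mb = batch_memory_mb(batch_size, columns=100, avg_bytes_per_column=20)
    print(f"BatchSize {batch_size:>9,}: ~{mb:,.1f} MB per batch")
```

Even with this crude estimate you can see why an arbitrarily high BatchSize is a bad idea: at some point each batch claims hundreds of megabytes, while the savings in round trips have long stopped mattering.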

There are several ways to change the "BatchSize" setting; I'll show you three.

1. Open the Data Flow task, left click the Attunity Source component and change the value on the properties page:
[Screenshot: SpeedUpAttunity02]

2. Open the Data Flow task, right click the Attunity Source component, open the Advanced Editor, go to the Component Properties tab and change the setting there:
[Screenshot: SpeedUpAttunity03]

3. On the Control Flow, left click the Data Flow task and change the value directly on the properties page:
[Screenshot: SpeedUpAttunity04]
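Since a .dtsx package is just an XML file, you could in principle also patch the value in the file itself, for example when you need to adjust many packages at once. The fragment below is a deliberately simplified stand-in; the real .dtsx schema is namespaced and far more complex, so treat this purely as a sketch of the idea, not as the actual package format.

```python
import xml.etree.ElementTree as ET

# Minimal stand-in for a pipeline fragment of a .dtsx file.
# Element and attribute names here are simplified assumptions.
package_xml = """
<pipeline>
  <component name="Oracle Source">
    <properties>
      <property name="BatchSize">100</property>
    </properties>
  </component>
</pipeline>
"""

root = ET.fromstring(package_xml)
for prop in root.iter("property"):
    if prop.get("name") == "BatchSize":
        prop.text = "100000"  # the value that worked well in our project

print(ET.tostring(root, encoding="unicode"))
```

If you go down this road with real packages, make a backup first and verify the result in the designer; hand-editing .dtsx files is unsupported territory.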

As you can see, I use a BatchSize of 100000 in all my examples. Please keep in mind that this value was appropriate in my current project. In other projects a lower or higher value might be the right choice, so please test carefully when you plan to change the default BatchSize property of your Attunity Source components.