Tuesday, March 27, 2012

Database capacity

Hi,

I’m working on a DWH project. The source system has data from the past 30 years, and the total number of records from the source system is 10 million.

I need to do a database sizing estimate for the upcoming 10 years of data.

Since 30 years of data comes to 10 million records, the expected data for the next 10 years should be in the region of 3 to 3.5 million records.
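As a quick sanity check on that figure, here is a minimal back-of-the-envelope sketch in Python, assuming the yearly volume stays roughly flat (which may be optimistic if business volume is growing):

# Back-of-the-envelope extrapolation (assumes yearly volume stays roughly flat).
historical_rows = 10_000_000      # records accumulated over the past 30 years
historical_years = 30
forecast_years = 10

rows_per_year = historical_rows / historical_years      # ~333,000 rows/year
forecast_rows = rows_per_year * forecast_years           # ~3.3 million rows
print(f"Expected new rows over {forecast_years} years: {forecast_rows:,.0f}")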

The ETL tool I’m using is SQL Server Integration Services, and the reporting tool is Business Objects.

Can anyone guide me on the implementation plan, in particular the memory required for the ETL jobs (as SSIS uses a large amount of memory), the database, and Business Objects?

Sheikh

A good place to start is this link: http://www.dmreview.com/article_sub.cfm?articleId=1048082

It talks in general terms about capacity planning.

Some basic things I would say are almost must-haves for a data warehouse:

1) A 64-bit environment -- you can use more than 4 GB of memory here (you can get past 4 GB on 32-bit with workarounds, but 64-bit is better, especially if you can afford it).

2) A minimum of 8 GB of RAM (considering you do not have more than 15 million rows).

3) Obviously -- enough hard disk space for your DW (a rough sizing sketch follows at the end of this reply).

4) A good disk subsystem.

But a lot depends on your usage pattern and what you are trying to achieve. Ultimately it will all come down to cost vs benefit.
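To illustrate how the disk estimate in point 3 might be worked out, here is a minimal sizing sketch in Python; the average row width and overhead factors below are assumptions you would replace with measurements from your own schema:

# Minimal disk sizing sketch -- all factors below are assumptions, not measurements.
total_rows = 15_000_000          # existing 10M plus the ~3-5M forecast, with headroom
avg_row_bytes = 500              # assumed average row width, including row overhead
index_factor = 1.5               # assumed index space relative to the base data
staging_backup_factor = 2.0      # assumed extra space for staging, tempdb, backups

raw_gb = total_rows * avg_row_bytes / (1024 ** 3)
planned_gb = raw_gb * index_factor * staging_backup_factor
print(f"Base data: ~{raw_gb:.1f} GB, planned disk capacity: ~{planned_gb:.1f} GB")

Even rough numbers like these make it easier to compare the cost of disk and RAM against the benefit you expect from the warehouse.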

Thank you for the helpful post.
