Showing posts with label source. Show all posts

Tuesday, March 27, 2012

Database capacity

Hi,

I'm working on a DWH project. The source system holds data from the past 30 years, and the total number of records in the source system is 10 million.

I need to estimate the database capacity for the upcoming 10 years of data.

Since 30 years of data comes to 10 million records, I expect the next 10 years to add no more than about 3 million.

The ETL tool I'm using is SQL Server Integration Services and the reporting tool is Business Objects.

Can anyone guide me on the implementation plan, in particular the memory taken by the ETL jobs (as SSIS uses a large amount of memory), the database, and Business Objects?

Sheikh

A good place to start is this link: http://www.dmreview.com/article_sub.cfm?articleId=1048082

It talks in general about capacity planning.

Some basic things I would say are almost must-haves for a data warehouse:

1) A 64-bit environment -- you can use more than 4 GB of memory here (well, you can use more than 4 GB on 32-bit too, but 64-bit is better, especially if you can afford it).

2) A minimum of 8 GB of RAM (considering you do not have more than 15 million rows).

3) Obviously -- enough hard disk space for your DW.

4) A good disk subsystem.

But a lot depends on your usage pattern and what you are trying to achieve. Ultimately it will all come down to cost vs benefit.
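As a rough back-of-the-envelope check (the 13 million row figure comes from the numbers in the post; the 200-byte average row size and the 2x allowance for indexes and staging are illustrative assumptions), the raw volume here is small by warehouse standards:

```sql
-- ~13 million rows (10M historical + ~3M over the next 10 years),
-- at an assumed 200 bytes per row, doubled for indexes and staging:
SELECT CAST(13000000 AS bigint) * 200 * 2 / 1024 / 1024 AS approx_mb;
-- on the order of a few GB of data
```

Even with generous assumptions, that comfortably fits the 8 GB RAM and commodity-disk advice above.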

Thank you for the helpful post.

Monday, March 19, 2012

Database attach failed

Here's the scoop: I got a general-purpose database (in the form of
*.mdb and *.ldf) from a 'reputable' source on the net. But when I
attempted to attach it to my SQL Server 2000 SP3 instance with EM, it
failed, complaining "Could not find row in sysindexes for database ID
10, object ID 1, index ID 1. Run DBCC CHECKTABLE on sysindexes."
Then I attempted a command-line attach (thinking it might have some
option...); same outcome. Then I ran a search on this NG and found
the following thread. The question is, MS most likely would not
support something like this, so now what? I also thought about
manually adding a row to the sysindexes table to 'fool the attach
process', but after looking at some sample data in this table, I don't
think it's a good idea to try. What can I do? Thanks.

http://groups.google.com/group/comp...3bee910fa30aa9a

time management (tatata9999@.gmail.com) writes:

Quote:

Originally Posted by

Here's the scoop, I got a generic-purpose database (in the form of
*.mdb and *.ldf) from a 'reputable' source on the net. But when I
attempted to attach it to my sql server 2000 with SP3 with EM, it
failed complaining "Could not find row in sysindexes for database ID
10, object ID 1, index ID 1. Run DBCC CHECKTABLE on sysindexes."


That message appears familiar. I seem to recall that is what happens
if you try to attach an SQL 2005 database on SQL 2000. As far as
SQL 2000 is concerned, that is an alien database, for which it cannot
really have any graceful handling.

Quote:

Originally Posted by

Then, I attempted to command line attach (thought it may have some
option...), same outcome. Then, ran a search on this NG, and found
the following thread,


Which is from 1998, and applies to really old versions of SQL Server.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

On Nov 19, 5:32 pm, Erland Sommarskog <esq...@.sommarskog.se> wrote:


Thanks for the follow-up, Erland. I tried to attach it with SQL
Server 2005 as well; the restored db seemed weird, with "D:\aName
\blakdsm\blad..", empty, not workable. Now, suppose I can get a
clean/good copy for SQL Server 2005: what are the chances of success
of using DUMP DATABASE and then BACKUP or LOAD back into SQL Server
2000? These two boxes are not connected and can't be.

Don

time management (tatata9999@.gmail.com) writes:

Quote:

Originally Posted by

Thanks for the follow-up, Erland. I tried to attach it with sql
server 2005 as well, the restored db seemed weired with "D:\aName
\blakdsm\blad..", empty, not workable. Now, suppose I can get a
clean/good copy for server 2005, what are the chances of success of
using DUMB database then BACKUP or LOAD back into server 2000? These
two boxes are not connected and they can't.


If you need to move a database from SQL 2005 to SQL 2000, you need to
create the database from scripts and copy the data to files with bulk
copy. If the database uses features that do not exist in SQL 2000, you
will have to make some compromises.

You cannot restore a backup from SQL 2005 on SQL 2000, for quite obvious
reasons: there are features in SQL 2005 for which SQL 2000 is not prepared.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

On Nov 20, 5:31 pm, Erland Sommarskog <esq...@.sommarskog.se> wrote:


Thanks, Erland. bcp is clumsy, or maybe I'm just not an expert with it :)
Viable options for the task all seem to consume quite a bit of time,
but hey, got to do the job. Once again, I appreciate it.

Don

Saturday, February 25, 2012

Data warehouse or data mart

Hi,
I'm working on a project at the moment where we have an existing warehouse, and
need to bring in data from new source systems, which are from a similar (but
not quite the same) part of the business. These two parts of the business
merged recently, and they want to have a consolidated view of the data, as
well as replace some existing (Excel- and Access-based) reporting.
We are coming to the end of the analysis and design phase, and I want to get
some opinions on whether the approach I'm taking is valid.
I have tried to bring everything into the one database. I've added some new
fact tables for new source systems, as the original and new source systems
don't quite work the same. I've also tried to conform the dimensions as far
as possible, so that the data will aggregate consistently for the management
reporting.
This seems to fit pretty well, and I haven't had to alter the current data
model very much.
My question is basically: Is this the right way to go, or am I better off
breaking the warehouse into separate data marts?
A lot of the reporting will only apply to each part of the business, but
there will be some management reporting that wants a consolidated view of the
new merged business.
Any comments will be appreciated.
Thanks.
Hi,
In my view, there is no need to break them into separate data
marts in this situation, because they have a similar model and need to be
merged into the same analysis reports. You may consider creating different
cubes, and you could at least share some common dimensions, such as
time/location.
Regards,
Peter Yang
MCSE2000/2003, MCSA, MCDBA
Microsoft Online Partner Support
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
================================================== ===
This posting is provided "AS IS" with no warranties, and confers no rights.
Hi Peter,
That's pretty much the approach I'm taking. The new source systems will have
their own fact tables, with conformed measures, joining to conformed
dimensions.
I'll build separate cubes for the fact tables, and put the old and new cubes
in a single virtual cube, joined on the common dimensions.
Thanks,
Wreck.
"Peter Yang [MSFT]" wrote:

> Hi,
> Based on my scope, there is no necessary to break them into seperate data
> marts for this situation because they have similar model and need to be
> merged in the same analysis report. You may consider to create different
> cube and you could at least share some common dimensions such as
> time/location etc.
> Regards,
> Peter Yang
> MCSE2000/2003, MCSA, MCDBA
> Microsoft Online Partner Support
> When responding to posts, please "Reply to Group" via your newsreader so
> that others may learn and benefit from your issue.
> ================================================== ===
>
> This posting is provided "AS IS" with no warranties, and confers no rights.
>


Friday, February 24, 2012

Data types question - Varbinary and timestamps

Greetings once again SSIS friends,

I have some source tables which contain timestamp fields (that's timestamp data type not datetime). My dimension table holds the maximum timestamp value as a varbinary(8).

I want my package to have a variable that holds that value but I don't know which data type to use for this. The reason for this is because I want to use that variable to then retrieve all records from my source table that have a timestamp value greater than the value stored in the variable.

Please advise on what data type is suitable.

Thanks for your help in advance.

Greetings once again guys,

I found the following article, which gives a workable solution to my problem, but I am still interested to find out the SSIS equivalent of the timestamp data type.

http://solidqualitylearning.com/blogs/erik/archive/2005/12/09/1499.aspx

You will want to use DT_BYTES. In the data type mapping table on MSDN, DT_BYTES corresponds to binary, varbinary, and timestamp on SQL Server, BigBinary and VarBinary on Jet, and RAW on Oracle. See this article on MSDN:

http://msdn2.microsoft.com/en-us/library/ms141036.aspx

Does this answer your question?
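The workaround in the linked article amounts to treating the rowversion as a bigint, which SSIS handles natively as an Int64. A sketch of that approach, with hypothetical table and column names:

```sql
-- Keep the high-water mark as bigint instead of varbinary(8), so it can
-- live in an SSIS Int64 variable. Table/column names are illustrative.
DECLARE @LastTS bigint;
SELECT @LastTS = MAX(CONVERT(bigint, MaxRowTS)) FROM dbo.DimWatermark;

-- Incremental extract: only rows changed since the last load.
SELECT *
FROM dbo.SourceTable
WHERE CONVERT(bigint, RowTS) > @LastTS;
```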

Sunday, February 19, 2012

Data type problem

Greetings SSIS friends,

I cast one of my fields from a decimal to an int in my data source (generated from a SQL command). But when I try to map that field to one of my other tables, I get an error saying that I cannot convert from decimal to Int32.

Is there any solution which doesn't require using the Derived Column or Data Conversion components?!

Your help would be appreciated.

Can you provide your SQL so that we can look at it? If you are casting to an integer from a decimal in the SQL, SSIS should only see an integer...

Hi Phil,

That's what I don't understand.

The SQL is pretty straightforward to be honest.

select
distinct
b.eventid as evt_eventid,
b.meetid as evt_meetid,
cast(h.longname as varchar) as evt_long_name,
cast(h.shortname as varchar) as evt_source_short_name,
b.expstartdate as evt_expstartdate,
b.expstarttime as evt_expstarttime,
b.eventnumber as evt_eventnumber,
b.expertgroup as evt_expertgroup,
b.currentrun as evt_currentrun,
b.enddate as evt_enddate,
b.endtime as evt_endtime,
b.eventstatus as evt_eventstatus,
f.etypeid evt_etypeid,
cast(g.longname as varchar) as evt_type_long_name,
cast(h.longname as varchar) as evt_name,
b.maxplaces as evt_maxplaces,
b.maxpayout as evt_specific_maxpayout,
f.maxpayout as evt_default_type_maxpayout,
b.offdate as evt_offdate,
b.offtime as evt_offtime,
cast(b.callitap as char(1)) as evt_callitap,
cast(isnull(b.maxstake, 0) as int) as evt_maxstake,
b.scoop6leg as evt_scoop6leg,
b.riskoff as evt_riskoff,
b.userofftime as evt_userofftime,
b.delayofftime as evt_delayofftime,
cast(b.suspendliability as int) as evt_suspendliability,
b.termsid as evt_specific_termsid,
i.termsid as evt_default_termsid,
cast(j.longname as varchar) as evt_default_terms,
cast(l.longname as varchar) as specific_evt_terms,
convert(varchar(20), convert(bigint, b.updatetimestamp)) as evt_updatetimestamp
from meetings as a
inner join events as b
on a.meetid = b.meetid
inner join names as c
on a.nameid = c.nameid
inner join EvenType as d
on a.etypeid = d.etypeid
inner join names as e
on d.nameid = e.nameid
inner join EvenType as f
on b.etypeid = f.etypeid
inner join names as g
on f.nameid = g.nameid
inner join names as h
on b.nameid = h.nameid
inner join terms as i
on f.termsid = i.termsid
inner join names as j
on i.nameid = j.nameid
left join terms as k
on b.termsid = k.termsid
left join names as l
on k.nameid = l.nameid
where
convert(varchar(20), convert(bigint, a.updatetimestamp)) > ?
or convert(varchar(20), convert(bigint, b.updatetimestamp)) > ?

order by b.meetid, b.eventid

In SSIS however, it still thinks the column is a decimal.

Did you previously have the decimal coming into the source and then later change the SQL to cast it to an integer?

In the source column mappings, delete the row that corresponds to the column you are working with, and re-map it. Does that help?

Hi Phil,

Yes, you are right. I unmapped the column, remapped it, and reconfigured my other Merge Joins, and it is now the data type that I cast it to in the SQL code.

Thanks for your help.

Data Type in Oracle is not translating to an SSIS data type

My OLE DB data source in the Lookup Transformation is not converting.

When I try to join the columns, it claims the data types do not match; the only difference is the precision.

ORACLE
select CAST(group_id AS NUMBER(18)) group_id from blah blah table
SSIS data type = [DT_NUMERIC], Precision = 38

SQL Server data source
SELECT
CAST(PRINCIPAL_ID as numeric) GROUP_ID,
SSIS data type = [DT_NUMERIC], Precision = 18

The metadata of the columns being joined has to match exactly (same precision, same scale).
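A sketch of one way to make the two sides line up, assuming the key fits in numeric(18,0); the table names are illustrative:

```sql
-- Oracle side: pin precision AND scale explicitly.
SELECT CAST(group_id AS NUMBER(18,0)) AS group_id
FROM some_oracle_table;

-- SQL Server side: numeric defaults to numeric(18,0), but spelling it
-- out makes the intent (and the SSIS metadata) unambiguous.
SELECT CAST(PRINCIPAL_ID AS numeric(18,0)) AS GROUP_ID
FROM some_sql_table;
```

If the Oracle column still comes through as precision 38, a Data Conversion transformation to DT_NUMERIC with precision 18, scale 0 on that path is the usual fallback.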

Data type error

I have created a Foreach Loop container which generates a string variable containing a Select statement that is in turn used in my OLE DB source in my Data Flow task. I have 3 variables that I am using to create the Select statement. Two of them work fine, but the 3rd gives me an error, "Cannot convert varchar to numeric", after about the 5th or 6th loop, which is odd as the variable is the same for each pass.

The SQL Task linked to the Foreach Loop is a query as follows

SELECT DISTINCT

CAST(FYr as varchar(4)) as FYr,

CAST(Acct1 as varchar(13)) as Acct1,

CAST(Acct2 as varchar(13)) as Acct2

FROM GL, AcctTbl

The resulting dataset looks like this (there is only one record in AcctTbl)

FYr Acct1 Acct2

2000 400.00 307.00

2001 400.00 307.00

2002 400.00 307.00

etc, which is exactly what I want.

I've created package scope string variables for sFYr, sAcct1 and sAcct2 as well as another variable qrySQL

The value for qrySQL string variable is something like this

"Select " + @.User :: sFYr + " as FiscalYear, Account, " + @.User :: sAcct2 + "Amount FROM tblGL WHERE Account < " + @.User :: sAcct1

This is then used as my datasourse in my data flow task

When I run the package, it goes through half a dozen iterations (there are about a dozen total rows to iterate) and successfully writes the results to my data destination table, but then fails with the "cannot convert varchar to numeric" message.

It seems to be the sAcct1 variable, because if I use the same string for my qrySQL except that I replace the sAcct1 variable with a string literal (as shown below), the package completes successfully.

"Select " + @.User :: sFYr + " as FiscalYear, Account, " + @.User :: sAcct2 + "Amount FROM tblGL WHERE Account < '400.00'

Does anyone have any ideas? Can I not use < to compare strings? The Account field that I'm comparing is a varchar(13) field. I've even tried casting the Account field and the sAcct1 variable as numeric in the qrySQL string, and I get the same failure after several iterations.

Any insight would really be appreciated. I've lost a bit of hair over this one.

Thanks in advance

You should be including single quotes around sAcct1 in the WHERE clause, such as "...WHERE Account < '" + @.User :: sAcct1 + "'"

I believe SQL Server is converting the value of your Account column to a numeric to match the datatype you are sending it. The error is occurring because you have data in that column that fails the conversion.
|||

wpwebster wrote:

I have created a Foreach Loop container which generates a string variable which is a Select statement that in turn used in my OLE DB source in my Data Flow task. I have a 3 variables that I am using to create the Select statement. Two of them work fine but the 3rd gives me an error "Cannot convert varchar to numeric" after about the 5th or 6th loop which is odd as the variable is the same for each pass.


Is it possible that the Account column in the tblGL table is numeric? If so, make sure that you cast the values of Acct1 from AcctTbl accordingly. Why are you casting it to varchar in the query and putting it in a string variable? Wouldn't it be better to use a data type that is consistent with tblGL.Account?

|||

Thanks for the input. It helped me get to the bottom of it. The Account field is a varchar(13) field, although the accounts are in a format something like 400.00. There were, however, several records where the Account was NA; when I changed those to "0" through a Derived Column data flow component, it worked fine.

I still find it a bit puzzling that the WHERE clause worked when it was WHERE Account < '400.00' but wouldn't work when it was WHERE Account < @.UserVariable.

Making sure all Accounts looked like numbers did the trick though.

Thanks again for the input.

Regards

Bill
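The 'NA' rows explain the intermittent failure: SQL Server converts the varchar Account column to numeric to match the parameter's type, and that conversion blows up on the non-numeric values. One way to guard the comparison on the SQL side (column names and the literal follow the thread; note that ISNUMERIC has well-known quirks with values like '.', '$' and scientific notation, so treat this as a sketch):

```sql
-- Skip non-numeric account codes such as 'NA' before comparing
-- the values numerically.
SELECT FiscalYear, Account, Amount
FROM tblGL
WHERE ISNUMERIC(Account) = 1
  AND CAST(Account AS numeric(13,2)) < CAST('400.00' AS numeric(13,2));
```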

Friday, February 17, 2012

Data type conversions

What are the pros/cons of datatype conversions using the advanced properties of the source adapters or transformation components themselves vs. using the data conversion transformation? Sometimes I am able to do all the conversions I need using the advanced properties rather than use a data conversion component, but didn't know if this was considered bad practice, or has some other impact that isn't immediately apparent.

Thanks

Kory

I often use the Derived Column component for data conversion.

I think it works well for converting lots of data types at the same time.

|||

I'd say doing as much conversion up front in the source as possible is preferable. Doing it later on means you have a wider buffer, to accommodate the two columns required. That said, conversions in the source can be harder to maintain sometimes, as refreshing the metadata can lose your little tweaks. It depends on how stable you consider your source.

Using one transformation over the other, Derived Column or Data Conversion, would not seem like much of a choice; I doubt there is much to choose between them. They are both synchronous transforms, so each requires two "columns" in the buffer: the current type and the new type. That obviously makes for a wider buffer and therefore more cost, due to fewer rows per buffer.

Still, do what works for you, but test a few scenarios if you can and see which works best. Is performance or maintenance the bigger cost for you?
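A sketch of the "convert up front" option: do the casts in the OLE DB Source query itself, so the buffer only ever carries the target types (table and column names here are illustrative):

```sql
-- Conversions done in the source query; no Data Conversion or Derived
-- Column component is needed downstream, and the buffer stays narrow.
SELECT
    CAST(order_id AS int)                AS order_id,
    CAST(order_total AS numeric(12,2))   AS order_total,
    CAST(customer_name AS nvarchar(100)) AS customer_name
FROM dbo.Orders;
```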

Tuesday, February 14, 2012

Data Transformation.

Hello,
I want to do some data transformation from one database to another.
My source DB is MS SQL Server, to which I can connect using OLE DB/SQL Native Client.
My destination DB is a different type of database, to which I can connect
using a Connection Manager. But for the data transformation I need a destination
connection, and in the data pump area I don't have the option to choose an ODBC
connection type. How can I connect to my destination database if there is no
ODBC connection type to choose from on the Data Flow tab?
Help.
Regards,
Sami
[Remove Numbers from e-mail address to use it]

I am using SQLBase from Gupta. The only way to connect to the database is
over ODBC. But as I mentioned, if I choose to do a Data Transformation I
don't have the option of choosing ODBC as my destination connection.
Regards,
Sami
[Remove Numbers from e-mail address to use it]
"Allan Mitchell" <allan@.no-spam.sqldts.com> schrieb im Newsbeitrag
news:f0343b032956e8c851df501af0f0@.msnews
.microsoft.com...
> Hello Sami,
>
> And what database type is your destination?
> You may find this post helpful
> http://forums.microsoft.com/MSDN/Sh...=70469&SiteID=1
>
> Allan Mitchell
> www.SQLDTS.com
> www.SQLIS.com
> www.Konesans.com
>
>

Sami,
Create a .NET Providers/ODBC data provider connection and configure it to
use a system DSN that connects to your SQLBase database. Within the data
flow, create a DataReader Destination and configure it to point to the
data connection that you created. On the second tab, you will have to
supply the SqlCommand to write the data.
Hope this helps.
-Steven
"Sami" wrote:

> I am using SQLBase from Gupta. The only way to connect to the Database is
> over ODBC. But as I mentioned, if I choose to do a Data Transformation I
> don't have the option of choosing ODBC as my destination connection.
> --
> Regards,
> Sami
> [Remove Numbers from e-mail address to use it]
> "Allan Mitchell" <allan@.no-spam.sqldts.com> schrieb im Newsbeitrag
> news:f0343b032956e8c851df501af0f0@.msnews
.microsoft.com...
>

Thank you very much for all your help.
Regards,
Sami
[Remove Numbers from e-mail address to use it]
"Sami" <s8a2m9i1_i5s1l9a6m@.hotmail.com> schrieb im Newsbeitrag
news:OPXbc4ygGHA.1260@.TK2MSFTNGP05.phx.gbl...
> Hello,
> I want to do some data transformation from one Database to another.
> My source DB is a MS SQLServer to which I can connect using
OLEDB/SQLnative.
> My destination DB is a different type of database to which I can connect
> using Connection Manager. But for data transformation I need a destination
> connection. In the data pump area I don't have the option to choose an
ODBC
> connection type. How can I connect to my destination database if there is
no
> ODBC connection type to choose from in the Dataflow Tab?
> Help.
> --
> Regards,
> Sami
> [Remove Numbers from e-mail address to use it]

Data Transformation - hanging

I'm running a DTS package on SQL Server. The source is MS Access and the
target is Oracle.

On a "Drop Table" command the process just hangs. There are no foreign keys on
the table. Several tables have already been processed successfully by this
time.

I think I've ruled out corruption by dropping and recreating the target
database on Oracle.

Any ideas?

M Man

Hi

There may be a lock on the table from a different process. I assume that
you are using this as a temporary table, so why not create these in
your SQL Server database rather than the destination one?

John
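One way to test the lock theory on the Oracle side is to ask the data dictionary which sessions hold locks on the table before issuing the DROP. A sketch (querying the v$ views requires appropriate privileges, and the table name is illustrative):

```sql
-- Sessions holding locks on the target table, which would block
-- the DROP TABLE from completing.
SELECT s.sid, s.serial#, s.username, o.object_name, l.locked_mode
FROM v$locked_object l
JOIN dba_objects o ON o.object_id = l.object_id
JOIN v$session s ON s.sid = l.session_id
WHERE o.object_name = 'MY_TABLE';
```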

"MMan37x" <mman37x@.cs.com> wrote in message
news:20041015203641.02596.00002058@.mb-m13.news.cs.com...
> I'm running a DTS package on SQL Server. The source is MS Access and the
> target is Oracle.
> On a "Drop Table" command the process just hangs. There are no foreign
keys on
> the table. Several tables have already been processed successfully by
this
> time.
> I think I've ruled out corruption by dropping and recreating the target
> database on Oracle.
> Any ideas?
> M Man