Showing posts with label multiple. Show all posts

Tuesday, March 27, 2012

Database Collation: Chinese, Japanese, Korean

My application supports multiple languages/locales in a single database. Some of our new customers want to support Chinese, Japanese, Korean, Italian, Spanish, and German in addition to English. Supporting the Latin-based languages is not a problem, but I am having trouble finding a collation sequence that allows me to correctly store the other double-byte languages in the same database.

I have found that changing the data types from text, char, and varchar to ntext, nchar, and nvarchar, and adding an N prefix in front of the various string literals being inserted, seems to work:

insert into CONTENTDATA (recordid, xml)
values (newid(), N'<CHANNEL1><FILE1/><TEXT1><![CDATA[和红魔拉拉队的动感精神
]]></TEXT1><TEXT3><![CDATA[和红魔拉拉队的动感精神]]></TEXT3></CHANNEL1>');

But this is not going to be a practical solution for us. Is there a collation sequence that would allow us to store multiple locales like we do in Oracle (AL32UTF8)?

Thanks in advance

Dov Rosenberg

You can store multiple languages in SQL Server (since version 7.0) using the Unicode data types, as you have noted. In Setup, you can also choose the default server collation. If you subsequently need a finer level of collation control, you can use the per-column collation feature in SQL Server 2000/2005. More details at http://msdn.microsoft.com/library/default.asp?url=/library/en-us/architec/8_ar_da_6ttf.asp and http://msdn.microsoft.com/SQL/2000/learn/internat/default.aspx.
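To illustrate the reply's point, here is a hedged sketch of a table using Unicode types with per-column collations. The table and column names are illustrative only (not from the original post), and the collation names shown are standard SQL Server 2000/2005 collations worth verifying against your server's sys.fn_helpcollations() output:

-- Illustrative sketch: nvarchar/ntext columns hold any locale's text;
-- COLLATE only affects sorting/comparison, not what can be stored.
CREATE TABLE CONTENTDATA_EXAMPLE (
    recordid uniqueidentifier NOT NULL DEFAULT NEWID(),
    xml      ntext,                                   -- Unicode, stores any locale
    title_zh nvarchar(200) COLLATE Chinese_PRC_CI_AS, -- per-column collation
    title_ja nvarchar(200) COLLATE Japanese_CI_AS
);

-- The N prefix keeps the literal in Unicode all the way to the column:
INSERT INTO CONTENTDATA_EXAMPLE (recordid, xml)
VALUES (NEWID(), N'<CHANNEL1><TEXT1>...</TEXT1></CHANNEL1>');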

Saturday, February 25, 2012

Data warehouse data refresh/update

Hello,
I'm a member of a team planning a data warehousing project. We have multiple data sources which are aggregated in a staging area. This is then denormalised and imported into the data warehouse database.
I am looking at ideas for incremental data refresh, rather than a drop and re-import of all data. This would allow us to have historic data.
Does anyone have any tips that might be helpful for detecting changes in the source data for import? We have had some bad experiences with triggers on our source database in the past, so would rather not use these. I have considered replication and log shipping, but these just give a replica of the source data and do not flag the updated/new data.
Any help would be greatly appreciated.
Thanks.
Ben.
Look in SQL Books Online for CHECKSUM_AGG to identify changes in a table. The difficulty then is in trying to identify what has changed. Another route is to try to identify fields within the production data that will identify when it was last changed. Typically these tend to be datetime, timestamp or rowversion data types.
If you keep production keys in your fact table as additional attributes then this will give you another option for identifying new data.
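As a rough sketch of the two approaches mentioned above (table and column names are hypothetical, not from the original post): CHECKSUM_AGG over BINARY_CHECKSUM tells you *whether* a table changed since the last load, while a last-changed column tells you *which* rows changed.

-- Approach 1: detect that anything changed (but not which rows).
-- Save this value at each load and compare on the next run.
SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) AS table_checksum
FROM SourceTable;

-- Approach 2: pull only rows changed since the last load,
-- assuming the source has a LastUpdated datetime column.
SELECT *
FROM SourceTable
WHERE LastUpdated > @LastLoadTime;

Note that CHECKSUM_AGG is cheap to compute but can, in rare cases, miss offsetting changes, so it is best used as a quick "did anything move?" test rather than a guarantee.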
"Ben" wrote:

> Hello,
> Im a member of a team planning a data warehousing project. We have multiple data sources which are aggregated in a staging area. This is then denormalised and imported into the datawarehouse database.
> I am looking at ideas for incremental data refresh, rather than a drop and re-import of all data. This would allow us to have historic data.
> Does anyone have any tips that might be helpful for detecting changes in the source data for import? We have had some bad experiences with triggers on our source database in the past, so would rather not use these. I have considered replication and log
shipping, but these just give a replica of the source data and does not flag the updated/new data.
> Any help would be greatly appreciated.
> Thanks.
> Ben.
Timestamp your source data when it gets changed. That is definitely the best way.
Regards
Jamie
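Jamie's suggestion can be implemented without triggers by letting the database stamp rows itself. A minimal sketch, assuming a hypothetical SourceTable (a rowversion column is maintained automatically by SQL Server on every insert or update):

-- Add an automatically maintained change marker to the source table:
ALTER TABLE SourceTable ADD RowVer rowversion;

-- At load time, fetch only rows touched since the previous load,
-- where @LastMaxRowVer is the highest RowVer seen last time:
SELECT *
FROM SourceTable
WHERE RowVer > @LastMaxRowVer;

Unlike a datetime column, rowversion cannot be forgotten or set incorrectly by application code, which makes it a reliable high-water mark for incremental extracts.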
"Ben" wrote:

> Hello,
> Im a member of a team planning a data warehousing project. We have multiple data sources which are aggregated in a staging area. This is then denormalised and imported into the datawarehouse database.
> I am looking at ideas for incremental data refresh, rather than a drop and re-import of all data. This would allow us to have historic data.
> Does anyone have any tips that might be helpful for detecting changes in the source data for import? We have had some bad experiences with triggers on our source database in the past, so would rather not use these. I have considered replication and log
shipping, but these just give a replica of the source data and does not flag the updated/new data.
> Any help would be greatly appreciated.
> Thanks.
> Ben.


Friday, February 17, 2012

Data type conversion chart

Is there such a thing as a data type conversion chart between multiple databases, i.e., which data types are comparable across databases (Oracle vs. SQL Server, etc.)?

Specifically, I have a LONG RAW column in Oracle and I'm unsure if it is the same data type in SQL Server.
Links would be appreciated!

Books Online has a page entitled About Oracle Subscribers in SQL 7 and Oracle Subscribers in SQL 2000. It's a starting point.
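For the specific LONG RAW question, a hedged sketch of the usual mapping (worth verifying against the Books Online pages mentioned above, since exact equivalents vary by version):

-- Rough Oracle -> SQL Server equivalents (verify against vendor docs):
--   LONG RAW  -> image (SQL 2000 era) or varbinary(max) (SQL 2005+)
--   VARCHAR2  -> varchar
--   NUMBER    -> numeric / decimal
CREATE TABLE BlobData (
    id      int NOT NULL,
    payload image  -- SQL 2000-era counterpart of Oracle's LONG RAW
);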