Thursday, March 29, 2012
Change Logical File Name Containing a Hyphen
I'm running SQL Server 2000. I have a DB with a logical file name containing
a hyphen. When trying to change the size of the db and log files by running
the following SQL statement it throws an error stating the file name is not
listed in sysfiles:
-- Modify db file size and max size
alter database DB_NAME_DS
MODIFY FILE
(
NAME = [DB_NAME-DS_DATA],
SIZE = 50MB,
MAXSIZE = 100MB,
FILEGROWTH = 50MB
)
-- Modify log file size and max size
alter database DB_NAME_DS
MODIFY FILE
(
NAME = [DB_NAME-DS_Log],
SIZE = 50MB,
MAXSIZE = 100MB,
FILEGROWTH = 50MB
)
GO
I've verified the logical file names using SP_HELPDB. When the DB was
created I did not specify any logical file names. SQL Server assigned them
automatically and put the hyphens in. What is causing the problem with the
SQL command, and how can I get around it?
Thanks for any help!|||Please post result of following:
sp_helpfile
"Skippy, DOT DBA" wrote:
> I'm running SQL Server 2000. I have a DB with a logical file name containing
> a hyphen. When trying to change the size of the db and log files by running
> the following SQL statement it throws an error stating the file name is not
> listed in sysfiles:
> -- Modify db file size and max size
> alter database DB_NAME_DS
> MODIFY FILE
> (
> NAME = [DB_NAME-DS_DATA],
> SIZE = 50MB,
> MAXSIZE = 100MB,
> FILEGROWTH = 50MB
> )
> -- Modify log file size and max size
> alter database DB_NAME_DS
> MODIFY FILE
> (
> NAME = [DB_NAME-DS_Log],
> SIZE = 50MB,
> MAXSIZE = 100MB,
> FILEGROWTH = 50MB
> )
> GO
> I've verified the logical file names using SP_HELPDB. When the DB was
> created I did not specify any logical file names. SQL Server assigned them
> automatically and put the hyphens in. What is causing the problem with the
> SQL command and how can I get around this problem.
> Thanks for any help!|||SP_HELPFILE results:
DB_NAME-DS_Data   1   &lt;drive letter&gt;:\data\DB_NAME-DS_Data.MDF   PRIMARY   32896 KB    Unlimited   10%        data only
DB_NAME-DS_Log    2   &lt;drive letter&gt;:\Logs\DB_NAME-DS_Log.LDF    NULL      102400 KB   Unlimited   25600 KB   log only
"Absar Ahmad" wrote:
> Please post result of following:
> sp_helpfile
> "Skippy, DOT DBA" wrote:
> > I'm running SQL Server 2000. I have a DB with a logical file name containing
> > a hyphen. When trying to change the size of the db and log files by running
> > the following SQL statement it throws an error stating the file name is not
> > listed in sysfiles:
> >
> > -- Modify db file size and max size
> > alter database DB_NAME_DS
> > MODIFY FILE
> > (
> > NAME = [DB_NAME-DS_DATA],
> > SIZE = 50MB,
> > MAXSIZE = 100MB,
> > FILEGROWTH = 50MB
> > )
> > -- Modify log file size and max size
> > alter database DB_NAME_DS
> > MODIFY FILE
> > (
> > NAME = [DB_NAME-DS_Log],
> > SIZE = 50MB,
> > MAXSIZE = 100MB,
> > FILEGROWTH = 50MB
> > )
> > GO
> >
> > I've verified the logical file names using SP_HELPDB. When the DB was
> > created I did not specify any logical file names. SQL Server assigned them
> > automatically and put the hyphens in. What is causing the problem with the
> > SQL command and how can I get around this problem.
> >
> > Thanks for any help!
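For readers hitting the same error: a hyphen is legal inside a bracket-delimited identifier, so the usual culprit is a mismatch (case, a trailing space, or a stale name) between the script and what sysfiles actually holds. A hedged sketch, reusing the logical names from the posts above; verify them against your own sysfiles output first:

```sql
-- Sketch only: logical names are taken from this thread and may differ on your server.
-- Run inside the database to see the logical names exactly as stored:
USE DB_NAME_DS
SELECT name FROM sysfiles

-- Delimit the hyphenated name exactly as sysfiles reports it (note: _Data, not _DATA):
ALTER DATABASE DB_NAME_DS
MODIFY FILE (NAME = [DB_NAME-DS_Data], SIZE = 50MB, MAXSIZE = 100MB, FILEGROWTH = 50MB)

-- Or rename the file once to a hyphen-free logical name, so later scripts
-- need no delimiters (NEWNAME is documented for ALTER DATABASE ... MODIFY FILE):
ALTER DATABASE DB_NAME_DS
MODIFY FILE (NAME = [DB_NAME-DS_Data], NEWNAME = DB_NAME_DS_Data)
```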
Change location of RS log files?
How can I change the location where the RS log files are
stored/created? The .config file seems to have options to change the
filename, but will this allow me to specify a different folder as
well?
We would like to put these files on a separate hard drive from our
main application drive, in case the log files ever happen to grow
exponentially and fill the entire DB again.
Matt Brown|||Each of the following files for MS Reporting Services has a section called
RStrace that contains the settings for the log files.
...\ReportServer\bin\ReportingServicesService.exe.config
...\ReportServer\web.config
...\ReportManager\web.config
A default installation writes the log files to the directory where Reporting
Services is installed under the LogFiles folder. In order to change the
default location of the log files you need to add the following line under
the RStrace section for each of the above files. Once you have added the
directory line you will need to restart the ReportServer service.
<add name="Directory" value="your dir" />
Example:
<RStrace>
<add name="Directory" value="d:\RSLogs" />
<add name="FileName" value="ReportServerService_" />
<add name="FileSizeLimitMb" value="32" />
<add name="KeepFilesForDays" value="14" />
<add name="Prefix" value="tid, time" />
<add name="TraceListeners" value="debugwindow, file"
/>
<add name="TraceFileMode" value="unique" />
<add name="Components" value="all" />
</RStrace>
Sunday, March 25, 2012
Change formatting
I have a txt file that I import into my SQL2003 server. One of the items
is formatted as 10/10/2005 13:23. That is the way the file is sent to me.
I would like to be able to have it changed to just the 10/10/2005 format.
Is there an easy way to auto do that?
I currently have a tmp table that the data gets read into first, then it
updates the main table for any changes of existing records, so there is room
for me to have something go in and modify the data.
Hi
You don't say how you are loading this file?
You could treat this as two fields and ignore the second field (time part).
If you are using BCP or BULK INSERT then you can specify a format file and if
you are using DTS or the import wizard you can also specify that it should
ignore this data.
If you are loading into a datetime datatype you will always be holding the
time portion, but this will be 00:00:00 if you don't specify the time. A
different alternative (and probably slower!) if you are using datetime would
be to update the data post insert.
John
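A rough sketch of that post-insert cleanup. The staging table and column names (tmpImport, LoadDate) are made up for illustration; substitute your own:

```sql
-- Hypothetical staging table and column names; sketch only.
-- Converting a datetime to a date-only string and back zeroes the time portion.
UPDATE tmpImport
SET LoadDate = CONVERT(varchar(10), LoadDate, 101)  -- mm/dd/yyyy; time becomes 00:00:00
WHERE LoadDate IS NOT NULL
```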
"Johnfli" wrote:
> I have a txt file that I import into my SQL2003 server. One of the items
> is formated as 10/10/2005 13:23. That is the way teh file is sent to me.
> I would like to be able to have it changed to just the 10/10/2005 format.
> Is there an easy way to auto do that?
> I currently have a tmp Table that the data gets readinto first, then it
> updates the main table for any changes of exsisting records so there is room
> for me to have somethign go in and modify the data.
>
>
|||I have a DTS package that is doing a bulk insert after reading the text
file. The text file is comma delimited, so I do not know how I would be able
to separate the time from the date.
Here is a sample of the data being imported:
"ON HAND",12/19/2005
13:21:50,"01670136754","GILBERT","AEMNL0512012","0 9635","7977524ANJA","4",4,118,51.72,0.44,"N",,0,,0 ,,
"ON HAND",12/19/2005
13:21:50,"01670136754","GILBERT","AEMNL0512012","0 9635","7977525ANJA","4",4,118,51.71,0.44,"N",,0,,0 ,,
"ON HAND",12/19/2005
13:21:50,"01670136754","GILBERT","AEMNL0512012","0 9620","7977524ANJA","2",2,50,22.41,0.22,"N",,0,,0, ,
"ON HAND",12/19/2005
13:21:50,"01670136754","GILBERT","AEMNL0512012","0 9620","7977525ANJA","2",2,50,22.41,0.22,"N",,0,,0, ,
"John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:D5D46761-C4B7-4557-825A-774BC97B8425@.microsoft.com...[vbcol=seagreen]
> Hi
> You don't say how you are loading this file?
> You could treat this as two fields and ignore the second field (time
> part).
> If you are using BCP or BULK INSERT then you can specify a format file and
> if
> you are using DTS or the import wizard you can also specify that it should
> ignore this data.
> If you are loading into a datatime datatype you will always be holding the
> time portion, but this will be 00:00:00 if you don't specify the time. A
> different alternative (and probably slower!) if you are using datatime
> would
> to update the data post insert.
> John
> "Johnfli" wrote:
|||Hi
You can specify that the source file is comma delimited and if you use an
ActiveX transformation you can then you can specify something like:
DTSDestination("Col002") = LEFT(DTSSource("Col002"),8)
in the properties dialog (The properties button is on the general tab)
John
"Johnfli" wrote:
> I have a DTS package that is doing a bulk insert after reading teh text
> file. Teh text file is camma delimted so I do not know how I would be able
> to seperate teh time from teh date.
> Here is a sample of the date being imported:
> "ON HAND",12/19/2005
> 13:21:50,"01670136754","GILBERT","AEMNL0512012","0 9635","7977524ANJA","4",4,118,51.72,0.44,"N",,0,,0 ,,
> "ON HAND",12/19/2005
> 13:21:50,"01670136754","GILBERT","AEMNL0512012","0 9635","7977525ANJA","4",4,118,51.71,0.44,"N",,0,,0 ,,
> "ON HAND",12/19/2005
> 13:21:50,"01670136754","GILBERT","AEMNL0512012","0 9620","7977524ANJA","2",2,50,22.41,0.22,"N",,0,,0, ,
> "ON HAND",12/19/2005
> 13:21:50,"01670136754","GILBERT","AEMNL0512012","0 9620","7977525ANJA","2",2,50,22.41,0.22,"N",,0,,0, ,
>
>
> "John Bell" <jbellnewsposts@.hotmail.com> wrote in message
> news:D5D46761-C4B7-4557-825A-774BC97B8425@.microsoft.com...
>
>
Change Flat File Source Dynamically
I am a relative newbie to SSIS. I have been tasked with writing packages to import data from our clients. We have about 100 clients. Each client has a few different file formats. None of the clients have the same format as each other. We load files from each client each day. Each day the file name changes. I have done all of my current development work with a constant file name in a text file connection manager.
Ultimately we will write a VB application for the computer operator to select the flat file to load and the SSIS package to load it with. I had been planning on accomplishing this thru the SSIS command line interface. Can I specify the flat file to load via a variable that is passed through the command line? Do I need to use a Script Component to take the variable and assign it to the connection manager?
Is there a better way to do this? I have seen glimpses of a VB interface to SSIS. Maybe that is a better way to kick off the packages from a VB app?
Thanks,
Chris
You can set the ConnectionString property of your flat file connection managers to be based on an expression, which is in turn based on a package variable. You can then set the value of the variable from the command line when you run dtexec.
You can also use the Foreach Loop container to loop through all of the files in a given folder, and use the package variable as the loop enumerator, so that it will be automatically set to the name of the current file. If you know that files that are of a given format from a given client will always be in a specific folder, this might eliminate the need to have a custom UI and rely on operator input to kick off the ETL process. I don't know if this makes sense for your environment, but it sounds attractive to me.
I have not personally used the .NET API to execute packages, so I cannot comment on that approach.
|||Matthew,
Thank you! I knew that should be possible, but I couldn't figure out how. Based on your hint (expression) I was able to find the expressions section in my connection's properties pane. I added an expression binding the ConnectionString (path) to a variable. It works!
One small oddity... I wanted to set the variable in a Script Component for testing (rather than command line). However, I couldn't set the Script Component as the first object on my Data Flow since the Flat File Source object will not take a predecessor. Instead I created another Data Flow, made my main one dependent on that, and then added a "Source" Script Component to the new Data Flow... then I populated the new variable in that Script Component.
Thanks,
Chris
|||
tofferr wrote:
One small oddity... I wanted to set the variable in a Script Component for testing (rather than command line). However, I couldn't set the Script Component as the first object on my Data Flow since the Flat File Source object will not take a predecessor. Instead I created another Data Flow, made my main one dependent on that, and then added a "Source" Script Component to the new Data Flow... then I populated the new variable in that Script Component.
Thanks,
Chris
Why not use a Script Task immediately before your data flow task to set the variable?
|||
jwelch wrote:
Why not use a Script Task immediately before your data flow task to set the variable?
That's a good idea... but how do I access the variables? Variables.variableName is not getting recognized in the script component editor, even after I added it to the ReadWriteVariables list.
Thanks,
Chris
|||This article from Jamie is right on point for much of this.
http://blogs.conchango.com/jamiethomson/archive/2006/03/11/3063.aspx
Chris
|||Dts.Variables("NameofVar") should work in the Script Task.|||
jwelch wrote:
Dts.Variables("NameofVar") should work in the Script Task.
You also need to remember to specify the (case-sensitive!) variable's name in the ReadWriteVariables property of the Script task for this to work.
|||
tofferr wrote:
Matthew,
Thank you! I knew that should be possible, but, I couldn't figure out how. Based on your hint (expression) I was able to find the expressions section on my Connection's properties pane. I added an expression the ConnectionString (path) to a variable. It works!
Thanks for the feedback!
If you ask me, SSIS Expressions are the most powerful single feature in the SSIS toolset. Being able to declaratively bind arbitrary expressions to arbitrary properties of arbitrary components (with a few restrictions, of course), so that the expression is evaluated whenever the property is accessed, is incredibly powerful and cool. So many people I talk to use SSIS without having a real "light bulb moment" about expressions, but to me they are a primary tool for making SSIS packages configurable, maintainable and reusable.
I feel a blog post coming on...
Change filename for flat file connection manager at runtime
I need to add the current date to the end of the filename of a flat file at runtime.
This was previously done with ActiveX script in SQL Server 2000.
oConn = DTSGlobalVariables.Parent.Connections("FlatFileConnectionManagerName")
oConn.DataSource = sNewFileName
oConn = Nothing
I would really appreciate if someone could give me some information on how to achieve this in SQL Server 2005.
Thanks in advance!
Regards,
Sara
Basically, use property expressions: http://blogs.conchango.com/jamiethomson/archive/2006/06/28/4156.aspx
-Jamie
|||
Hi Sara
I am working with SSIS packages and am having the same problem that you faced some time back
Am getting an error message in SSIS package in the following lines:
Set oConn = DTSGlobalVariables.Parent.Connections("Text File (Destination)")
oConn.DataSource = sFilename
Set oConn = Nothing
Could you tell me what you did to fix your code. Would really appreciate your time and response.
Thanks
Rishi...
|||Hi Rishi,
I did something to fix it then, but I wouldn't solve it the same way today. The following example gets data from a file with a date in it. The date is always the current date, so the connection string needs to be updated to be able to read from the file.
I hope this gives you an idea of how to solve your problem!
Regards,
Sara
--
Script Task
Connection Manager Name: L:\VPKBA\Laddning\EDBKjerne\konto_YYYYMMDD.txt
File Name: L:\VPKBA\Laddning\EDBKjerne\konto.txt
Imports System
Imports System.IO
Imports System.Data
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
Public Sub Main()
Dim theDate As String
Dim i As Integer
Dim strPath As String ' was undeclared in the original post; fails under Option Explicit
strPath = "L:\vpkba\laddning\EDBkjerne\"
theDate = Now.ToString("yyyyMMdd") ' same result as the original IIf-based padding
For i = 0 To Dts.Connections.Count - 1
If Dts.Connections(i).Name.Contains("YYYYMMDD") Then
Dts.Connections.Item(i).ConnectionString = Dts.Connections.Item(i).Name.Replace("YYYYMMDD", theDate)
End If
Next
Dts.TaskResult = Dts.Results.Success
End Sub
End Class
Tuesday, March 20, 2012
change database owner
Hello,
I restored and renamed a client database from a backup file to my laptop (both SQL Express 2005) and noticed that no owner had been assigned. I set the db-owner to SA. This database will replace a local version of the same database that was created on my laptop. (The customer has entered data...)
When I start the dataset configuration wizard in VS2005 all tables appear twice in the list. Once with the original name and once with the prefix dbo. For example: Customer and dbo.Customer.
The tables with the dbo. prefix are marked with a Red Cross in the checkbox. Hovering over them display the error message: "Element .. in the dataset references an object missing from the database. "
The tables without the dbo. prefix are not marked. When I mark these tables VS2005 indicates that the table will be renamed to Customer1. The same issue applies to all the views in the database.
Is there a way to solve this problem without having to recreate the entire dataset? I'd rather not do that cause I added some queries to the dataset and they will be lost and have to be recreated.
Any help will be appreciated.
hi,
I could not reproduce the problem... even without assigning a valid db owner to the restored db, the VS designer did not present a doubled list of db objects...
I can only suggest verifying whether the user your login is bound to is missing from the dbo schema... try modifying that user's "default" schema to dbo as well...
regards
Monday, March 19, 2012
Change Data Type
I received a db2 data file that I converted to MS-Access. From Access I
ran the upsize wizard to put the table in SQL. It put the table in ok
but all the data types are nvarchar. I have a couple of fields that
are currency and some that are numeric.
I need to change the data types from nvarchar to numeric type fields. I
am new to SQL so I do not know all the commands. How do I change the
data type?
Michael Charney
*** Sent via Developersdex http://www.developersdex.com ***|||ALTER TABLE table_name ALTER COLUMN col_name VARCHAR(10) ;
--
David Portas
SQL Server MVP
--
"Nothing" <me@.you.com> wrote in message
news:1Je1f.13$vU5.1288@.news.uswest.net...
>I received a db2 data file that I converted to MS-Access. From Access I
> ran the upsize wizard to put the tabel in SQL. It put the table in ok
> but all the data types are nvarchar. I have a couple of the fields that
> are cureny and some that are numeric.
> I need to change the data types from nvarchar to numeric type fields. I
> am new to SQL so I do not know all the commands. How do I change the
> data type?
> Michael Charney
> *** Sent via Developersdex http://www.developersdex.com ***
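Since the upsized columns arrive as nvarchar, a hedged sketch of the conversion to numeric types. The table and column names (Orders, Amount, Quantity) are hypothetical; clean any currency formatting in the text first, then ALTER COLUMN to the target type:

```sql
-- Hypothetical table/column names for illustration only.
-- 1. Strip currency formatting that would make the conversion fail.
UPDATE Orders
SET Amount = REPLACE(REPLACE(Amount, '$', ''), ',', '')
WHERE Amount LIKE '%[$,]%'

-- 2. Change the column types in place.
ALTER TABLE Orders ALTER COLUMN Amount money
ALTER TABLE Orders ALTER COLUMN Quantity numeric(10, 2)
```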
Change data source name
I have created my reports using MDB file. Now I have changed my DB to SQL Server.
I can go and change my Datasource location on my machine and the reports work fine. But I need to distribute these reports to all my clients.
And all have different server names.
I am using CRX and VB.NET.
Tnx|||You can change the location by using the Datasource property (or its equivalent) in VB.NET.
Change CSV file to XLS
I want to automate my performance monitoring Reports(sql2000). So I need to import performance monitor output log file .CSV to a table. I feel converting this .CSV to .XLS is good way to importing the log thru DTS. Is there any command/ script to change the file type .CSV to .XLS, so that i can include this one into DTS Step.
thanks|||Originally posted by krishna
Hi,
I want to automate my performance monitoring Reports(sql2000). So I need to import performance monitor output log file .CSV to a table. I feel converting this .CSV to .XLS is good way to importing the log thru DTS. Is there any command/ script to change the file type .CSV to .XLS, so that i can include this one into DTS Step.
thanks
Why can you not import the csv file? What advantages does importing a .XLS file provide?|||because, it is easier to import an excel file than csv format.|||DTS can handle .CSV files just as easily as it can handle .XLS files; that is why I was asking for advantages. Sorry I couldn't help.
Originally posted by krishna
because, it is easy to import excel file than csv format.|||I don't see any issues in handling .CSV or .XLS by Excel and SQL server DTS.|||what datasource should I use? Microsoft Text-Treiber (*.txt,*.csv)?
If I use this one as the datasource to import the CSV file into a table, it does not convert the datetime column. I was thinking of using Microsoft Excel for the datasource; that's why I asked about converting csv to xls.
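Rather than converting to .XLS, the CSV can usually be loaded directly and the datetime converted afterwards. A hedged sketch (table name, column layout, and file path are all made up; a real perfmon log has one column per counter):

```sql
-- Hypothetical names throughout; sketch only.
CREATE TABLE PerfMonStage (
    SampleTime   varchar(30),   -- load the timestamp as plain text first
    CounterValue float
)

BULK INSERT PerfMonStage
FROM 'C:\perflogs\perfmon.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)  -- skip the header row

-- Convert the text timestamp once the rows are in.
SELECT CONVERT(datetime, SampleTime) AS SampleTime, CounterValue
FROM PerfMonStage
```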
Sunday, March 11, 2012
Change config file
Do I have to reboot after changing RSReportServer.config?
I am working with the alias and host names. I have the alias set to true, but it only puts in john_g.
I need it to add '@.domain.com' Can that be done in the config using the
DefaultHostName? If so, just put domain.com in?
Thanks.|||You don't have to reboot, but you should restart the ReportServer service
(in Control Panel) after changing config files.
You might also want to run iisreset.exe, depending on the changes you are
making -- but that's more for when you have GUI code changes or are tweaking
custom extensions.
--
Cheers,
'(' Jeff A. Stucker
\
Business Intelligence
www.criadvantage.com
---
"John Geddes" <john_g@.alamode.com> wrote in message
news:%231A%23vsN9EHA.3792@.TK2MSFTNGP10.phx.gbl...
> Do I have to reboot after changing RSReportServer.config?
> I am working with the alias and host names. I have the alias set to true,
> but it only puts in john_g.
> I need it to add '@.domain.com' Can that be done in the config using the
> DefaultHostName? If so, just put domain.com in?
> Thanks.
>|||Yes, you want to use the defaultHostName element. You do not need to reboot
after changing the config file.
--
-Daniel
This posting is provided "AS IS" with no warranties, and confers no rights.
"John Geddes" <john_g@.alamode.com> wrote in message
news:%231A%23vsN9EHA.3792@.TK2MSFTNGP10.phx.gbl...
> Do I have to reboot after changing RSReportServer.config?
> I am working with the alias and host names. I have the alias set to true,
> but it only puts in john_g.
> I need it to add '@.domain.com' Can that be done in the config using the
> DefaultHostName? If so, just put domain.com in?
> Thanks.
>
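For reference, the element Daniel mentions lives in RSReportServer.config; a hypothetical fragment (exact placement in the file varies by Reporting Services version):

```xml
<DefaultHostName>domain.com</DefaultHostName>
```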
Change column datatype "$2,000" to 2000
Hello,
How do I change the datatype of a column in a CSV file? Preferably in the select statement (apparently the Cast, Convert, and Replace functions don't work when selecting from a CSV).
I have a page where users upload their CSV files to and then I use SQLBulkCopy to insert all the records into my sql table. The problem is with columns of money data.
EX: "$2,000" >> 2000
The CSV file interprets the "$2,000" as a string of text. How do I convert it to a decimal so I can insert it into my sql database?
The CSV doesn't interpret anything; CSV files have no data types, it's all just text. You can use a text editor to do a search and replace and remove all the dollar signs and commas if you wish. You should be able to set the SQL data type for currency as well. Or use a DTS job to change the data format instead of doing a bulk copy.
Jeff
|||Hi Jeff,
As I said, users are uploading CSV files to my site where I use SQLBulkCopy to automatically upload the records to a SQL table. So that means no manual editing of the csv file and no using DTS.
Or can you do a "DTS job" via code? If so I've never done that before and a code example would be much appreciated.
You're right, I shouldn't have said CSV interprets it. I meant while reading from the CSV to a datatable, etc it is interpreted as string data.
Thanks a bunch,
|||Moderators: please stop marking this post as resolved/answered. It is obvious Jeff's response is NOT the answer. Marking it so only removes the chance of my question actually being answered.
|||You can probably trigger a DTS job via code; I've never looked at whether this could be done or how. You may need to parse the CSV file in code as it is uploaded, before the bulk copy, to strip the string portions out and format the currency as a number. I'm not sure if SQLBulkCopy can change data types; I'm not familiar enough with it to say.
Jeff
PS: This isn't the answer either I'm afraid... :)
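If editing the file or the upload code is off the table, one option the thread doesn't spell out is to bulk-copy the money column into a varchar staging column and clean it up in T-SQL afterwards. A sketch under that assumption (the staging table and column names are hypothetical):

```sql
-- Strip the '$' and the thousands separators, then cast to decimal:
-- '$2,000' becomes 2000.00.
SELECT CAST(REPLACE(REPLACE(Amount, '$', ''), ',', '') AS decimal(18, 2)) AS AmountValue
FROM dbo.StagingUpload;
```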
Change collation with osql
I have an MS SQL Server 2000 database with collation
'SQL_Hungarian_CP1250_CI_AS'. I wanted to run a script file to insert
values into one of my table by osql.exe the following way:
OSQL.EXE -U myUserID -P mypassword -d myDB -i "C:\myscript.sql" -n
The script was run successfully but the inserted character fields
looked as if it were in different collation.
For example the script contains the following insert command:
INSERT INTO admFunction
([FunctionID],[FunctionName],[FunctionModes],[ParentFunctionID],[OrderNo],[ModulID],[Extra1ID],[Enabled])
VALUES (1,'Vevői ajánlat',8,85,1,1,100,1)
but the inserted 'FunctionName' field contained this after running the
script:
'Vevi ajnlat'
I suppose it is a collation problem but i don't know how to solve it.
Any suggestion?
Thanks in advance
szoli
---
szoli's Profile: http://www.msmcse.ms/member.php?userid=1477
View this thread: http://www.msmcse.ms/t-1870445621|||If FunctionName is of type nvarchar, try the following:
INSERT INTO admFunction
([FunctionID],[FunctionName],[FunctionModes],[ParentFunctionID],[OrderNo],[ModulID],[Extra1ID],[Enabled])
VALUES (1,N'Vevői ajánlat',8,85,1,1,100,1)
Joo Arajo|||Finally I found the solution.
I had to save my sql script in Query Analyzer in Unicode format and
then it worked well.
szoli
---
szoli's Profile: http://www.msmcse.ms/member.php?userid=1477
View this thread: http://www.msmcse.ms/t-1870445621
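Both halves of the fix matter: the script file must be saved as Unicode so osql reads the accented characters correctly, and string literals should carry the N prefix so they are typed as nvarchar rather than varchar. A small illustration:

```sql
-- Without the N prefix the literal is varchar, and characters outside
-- the server's code page can be silently replaced.
DECLARE @s nvarchar(50);
SET @s = N'Vevői ajánlat';  -- Unicode literal, accents preserved
SELECT @s;
```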
Thursday, March 8, 2012
change authentication method in a local database file
I have made a small asp.net project which uses a local database file as a part of the project. The project is running fine on my local machine, but when I upload it to the remote server, the login fails.
I suspect this can be solved by using SQL Server authentication. But I have now spent a lot of time trying to configure the database file to use this authentication mode. As I see it there are three possible solutions to the problem:
- use Management Studio Express to configure the local MDF file (except that I can't find out how to connect to the MDF file) and from there change the authentication method to SQL Server authentication
- use Visual Web Developer to change the authentication method (but how?)
- make Windows authentication work on the server (this would probably require that Management Studio Express connects to the remote database; same problem as no. 1)
Help will be higly appreciated.
Bjarke
hi Bjarke,
to make things run, the account running the application server must be one of the SQL Server registered logins (at the instance level, not the database level).. if your solution is hosted, I do think the IT staff of the hosting company provides that "by default" or nothing would run, and I do not think they will grant you a standard SQL Server login... did you contact them?
regards
|||Hello Andrea!
Maybe I have misunderstood something?
When you make an asp.net 2.0 project there is a folder (App_Data) where it is possible via Visual Web Developer to add a SQL database file. As I have understood it, this file works as a little database server as well. Does SQL Server even have to be installed on the server to make these files work as a database?
I have not contacted the hosting company about this problem, because I didn't consider it a problem. But maybe I am wrong.
Do you know how to change the authentication method for a local database file? I have tried a lot of things, and searched the internet for an answer, but without luck.
Thanks Bjarke
|||hi Bjarke,
bjarke wrote:
Hello Andrea!
Maybe I have misunderstood something?
When you make an asp.net 2.0 project there is a folder (App_Data) where sit is possible via Visual web developer to add a sql database file. As I have understood it this file works as a little database server as well.
perhaps you mean the User Instances "mode", a feature available only with the SQLExpress edition of SQL Server 2005.. remember that this feature is not provided by hosters that do not have SQLExpress but only "traditional" SQL Server 2005 editions...
Does SQL server even have to be installed on the server to make these files work as a database?
yes, SQLExpress must be installed on the computer running the application, in your case the web application, thus the application server... when the app starts up, a User Instance of SQLExpress is generated (if not already existing)...
I have not contacted the hosting company about this problem, because I didnt consider it a problem. But Maybe I am wrong.
Do you know how to change the autnentication method for a local database file? I have tried a lot of things, and searched the internet for an answer, but without luck.
Thanks Bjarke
the "traditional" authentication "setting" can be changed using SSMSE, accessing the instance properties, in the security tab, or by modifying a Windows registry key.. stay with the first one
the service must be restarted to use the new settings..
as regard user instances, only Windows authentication can be used with this "access mode"...
if you mean how to modify the authentication settings in the connection strings, you can have a look here..
regards
Wednesday, March 7, 2012
Change default file name when exporting
When exporting a report, the file name is defaulted to the name of the RDL file for the report. Is there any way to change that to default to a value in a textbox in the report?
Stephanie|||On Jul 25, 3:56 pm, Stephanie <Stepha...@.discussions.microsoft.com>
wrote:
> When exporting a report, the file name is defaulted to the name of the RDL
> file for the report. Is there any way to change that to default to a value
> in a textbox in the report?
> Stephanie
As far as I know, there is not. One way to do this is to create a
custom ASP.NET application that uses a report viewer control and
intercepts the export functionality of the control and changes the
name of the file to save. Sorry that I could not be of greater
assistance.
Regards,
Enrique Martinez
Sr. Software Consultant
challenging search task is not working as expected
Hi Guys,
I have two tables called table1 and table2.
table1 has search words and table2 has file names, as below, and I want to get the file names from table2 that match all the search words.
table1 (column: searchword)
Learn more about melons
%.txt

table2 (column: testname)
FKOV43C6.EXE
frusdr.txt
FRUSDR.TXT
SPGP_FWPkg_66G.zip
readme.txt
README.TXT
watermelon.exe
Learn more about melons read me.txt
Here is the script of what I have tried. I hope someone can help me out of this loop. Thanks in advance.
===============================================================================
select * from @table2 t2 where t2.[testname] in (
SELECT tb.[testname] FROM @table1 ta
JOIN @table2 tb ON '% ' + tb.[testname] + ' %' LIKE '% ' + ta.searchword + ' %'
group by tb.[testname] having count(*) = (SELECT COUNT(*) FROM @table1)
)
===============================================================================
script to create tables
============================================================================
DECLARE @table1 TABLE (searchword VARCHAR(255))
INSERT INTO @table1 (searchword) VALUES ('Learn more about melons')
INSERT INTO @table1 (searchword) VALUES ('%.txt')
DECLARE @table2 TABLE (testname VARCHAR(255))
INSERT INTO @table2 (testname) VALUES ('FKOV43C6.EXE')
INSERT INTO @table2 (testname) VALUES ('frusdr.txt')
INSERT INTO @table2 (testname) VALUES ('FRUSDR.TXT')
INSERT INTO @table2 (testname) VALUES ('SPGP_FWPkg_66G.zip')
INSERT INTO @table2 (testname) VALUES ('readme.txt')
INSERT INTO @table2 (testname) VALUES ('README.TXT')
INSERT INTO @table2 (testname) VALUES ('watermelon.exe')
INSERT INTO @table2 (testname) VALUES ('Learn more about melons read me.txt')
SELECT * FROM @table2
DECLARE @table3 TABLE (testname VARCHAR(255))
INSERT INTO @table2 (testname) VALUES ('Melon release NOTES 321.xls')
===================================================================================
Here it is:
DECLARE @num AS int;
SELECT @num = COUNT(*) FROM @table1;
SELECT tb.[testname] FROM @table1 ta
CROSS JOIN (SELECT DISTINCT * FROM @table2) tb
WHERE tb.[testname] LIKE '%' + ta.searchword + '%'
GROUP BY tb.[testname] HAVING COUNT(ta.searchword) = @num
|||
maybe you should also take a look at "full text indexing":
e.g. the CONTAINS function
|||Hi Zuomin,
Thank you very much. It worked perfectly. Thanks for spending your valuable time.
Saturday, February 25, 2012
Certain numeric fields not read from the Excel file when using a Excel file source.
I have the Excel Connection Manager and Source to read the contents from an Excel file. For some reason a couple of numeric fields from the Excel worksheet are brought over as nulls even though they have values of 300 and 150. I am not sure why this is happening. I looked into the format of the fields and they are set to General in Excel; I tried setting them to Numeric and that did not help.
All the other content from the excel file is coming thru except for the 2 numeric fields.
I tried to bring the contents from the excel source to a text file in csv format and for some reason the 2 numeric fields came out as blank.
Any inputs on getting this addressed will be much appreciated.
Thanks,
Manisha
Try doing a data conversion of this column in the transformation phase using the Derived Column Transformation or the Data Conversion Transformation. Hope that works for you.|||It has to do with a registry setting. I got a response from some other forum and I tried it and it worked.
|||I am having a similar issue. I am importing an Excel sheet from a customer of ours into a SQL Server 2005 table. The first column is text and imports OK. The 2nd column has numbers, but the first two rows import as nulls. If I open the sheet and retype the numbers exactly as they appear, then the sheet imports OK. Each time they send us this sheet I have to retype the numbers in the first two rows. If I import the sheet into Access (using the Access Import wizard), Access does not have a problem and the first two numbers show up OK in Access (without my having to retype them first). How were you able to resolve this issue? Thanks.|||Al C. wrote:
I am having a similar issue. I am importing an Excel sheet from a customer of ours into a SQL Server 2005 table. The first column is text and imports OK. The 2nd column has numbers, but the first two rows import as nulls. If I open the sheet and retype the numbers exactly as they appear, then the sheet imports OK. Each time they send us this sheet I have to retype the numbers in the first two rows. If I import the sheet into Access (using the Access Import wizard), Access does not have a problem and the first two numbers show up OK in Access (without my having to retype them first). How were you able to resolve this issue? Thanks.
Can you set up the Excel connector to import all fields as text fields, and then later convert them to numerics using a derived column transformation?|||
Phil,
TFYR. I verified in the Excel Source (Advanced Editor) that it is using DT_WSTR in both the External columns and Output columns. When I click the Preview button it shows NULL in the first two rows. If I add a DataViewer it also shows nulls. Evidently the first two rows are considered text even though the values are numeric. I can see the difference when I use the formula auditing tool and the first two rows align to the left (indicating text) and the others align to the right (numeric). So what I have gathered is that Excel samples the first nn rows to determine the datatype and since most values are numeric it deems the datatype of the column to be numeric and so it replaces the first two values with nulls. If I modify the Excel connection string and add 'IMEX=1' (import mode=1) to the extended properties, then it works OK.
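For anyone hitting the same mixed-type column problem: IMEX=1 goes inside the Extended Properties of the Excel connection string, telling the Jet driver to treat mixed-type columns as text instead of guessing from the first sampled rows. A hypothetical example (the file path is illustrative):

```
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\customer.xls;
Extended Properties="Excel 8.0;HDR=YES;IMEX=1";
```

The registry setting mentioned earlier in the thread governs how many rows the driver samples when guessing a column's type, which is why retyping the first two values also "fixed" the import.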