Jamie Thomson created a downloadable SSIS package that loops through each dimension and measure group, processing them individually.
http://consultingblogs.emc.com/jamiethomson/attachment/4235.ashx
This may lessen the load on your AS server.
The first part of the statement does a standard select and would normally give you all the records.
select dbo.IPTable.rowid, dbo.IPTable.[Name], dbo.IPTable.IP, dbo.IPTable.UA, dbo.IPTable.Date
from dbo.IPTable
The second part of the statement finds the minimum rowId for each unique...
Joining the table back to itself and grouping by the fields you want should do it.
select dbo.IPTable.rowid, dbo.IPTable.[Name], dbo.IPTable.IP, dbo.IPTable.UA, dbo.IPTable.Date
from dbo.IPTable
inner join ( select min(rowid) min_rowid from dbo.IPTable group by [Name], [IP], [UA] ) min_IPTable...
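The statement above is truncated; a full sketch of the described approach is below. The join condition back to rowid is an assumption based on the explanation (keep only the row with the minimum rowid per group):

```sql
-- De-duplication sketch: keep the row with the minimum rowid
-- for each unique (Name, IP, UA) combination.
select dbo.IPTable.rowid, dbo.IPTable.[Name], dbo.IPTable.IP,
       dbo.IPTable.UA, dbo.IPTable.Date
from dbo.IPTable
inner join
(
    select min(rowid) as min_rowid
    from dbo.IPTable
    group by [Name], [IP], [UA]
) min_IPTable
    on dbo.IPTable.rowid = min_IPTable.min_rowid
```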
Untested but should work:
Select c.yyyy,
c.mm,
h.hhname,
ISNULL(Max(f.MessageCount),0) As HighestSendsToOneDevice,
ISNULL(Min(f.MessageCount),0) As LowestSendsToOneDevice,
ISNULL(Avg(cast(f.MessageCount as float)),0) As AverageSendsToOneDevice
From...
No problem, it's something I've worked on myself.
To get the File Name: In your Data Flow Task, you can use a Derived Column between your Flat File Source and your destination table. From the Variables list, drag User::FileName into the expression to create the derived column. I used a DT_STR data type...
Ah, George beat me to it; I dug out a version without temporary tables.
CREATE TABLE #Duration([DurationId] int identity(1,1), [Company ID] varchar(3), [Time] datetime)
INSERT INTO #Duration([Company ID], [Time]) VALUES('AAA', '13:09')
INSERT INTO #Duration([Company ID], [Time]) VALUES('AAA'...
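The rest of the answer is truncated, but given the sample data, the likely intent is the elapsed time between consecutive rows per company. A hedged sketch of that, assuming consecutive DurationId values within each [Company ID]:

```sql
-- Sketch only: minutes elapsed between consecutive entries
-- for each [Company ID], via a self-join on DurationId.
SELECT cur.[Company ID],
       cur.[Time] AS StartTime,
       nxt.[Time] AS EndTime,
       DATEDIFF(minute, cur.[Time], nxt.[Time]) AS MinutesElapsed
FROM #Duration cur
INNER JOIN #Duration nxt
    ON nxt.[Company ID] = cur.[Company ID]
   AND nxt.DurationId = cur.DurationId + 1
```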
Glasgow: It doesn't do a pre-check on SQL 2005 SP2. I'm using a 64-bit flavour and have tested it both by running the script directly and via a stored procedure. How very odd! If you find out why, please post; I'm quite curious.
Could you check directly in the INFORMATION_SCHEMA.COLUMNS view?
e.g.
IF EXISTS (SELECT 1 FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'Client' AND COLUMN_NAME = 'VATNo')
BEGIN
UPDATE Client SET VatNo='1234 564 789'
END
I've got this to work using two variables, a Script Task and a ForEach Loop container.
Variable #1 - WildCard at package scope.
Variable #2 - FileName at package scope.
Create a Script Task and set the WildCard variable to the desired pattern, e.g.
Dim FilePart As String
FilePart = Format(Now...
Not a direct answer to your question, but there is a huge amount of indexing resource out there, and this one I found to be a gem.
The Ultimate Missing Index Finder by Jesse Roberge.
http://www.sqlservercentral.com/scripts/Index+Management/63937/
If you need the currently logged-in user, including the domain name, SYSTEM_USER will give you this information.
As long as the stored proc used by SSRS is executed under the currently logged-in user's credentials, all should be well.
Our current SSRS reports pass this information as a...
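For reference, SYSTEM_USER can be queried directly; under Windows authentication it returns the login in DOMAIN\username form:

```sql
-- Returns the current login name; for Windows authentication
-- this includes the domain, e.g. DOMAIN\jsmith.
SELECT SYSTEM_USER;
```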
Rob Volk has created a dynamic cross-tabs/pivot table script that may meet your needs:
http://www.sqlteam.com/item.asp?ItemID=2955
This article generated a huge number of responses; it's worth reading through them:
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=6216
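To illustrate the basic cross-tab pattern the script builds on, here is a minimal static sketch using a hypothetical Sales table (table and column names are assumptions for illustration):

```sql
-- Static cross-tab sketch: pivot row values into columns
-- with one conditional aggregate per pivoted value.
SELECT Region,
       SUM(CASE WHEN [Quarter] = 'Q1' THEN Amount ELSE 0 END) AS Q1,
       SUM(CASE WHEN [Quarter] = 'Q2' THEN Amount ELSE 0 END) AS Q2
FROM Sales
GROUP BY Region
```

Rob Volk's script generates this kind of statement dynamically, so the pivoted columns don't have to be known in advance.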
You can do this using the GETDATE function.
http://msdn.microsoft.com/en-us/library/ms188383(SQL.90).aspx
Example:
INSERT INTO tblgardiff_50
([net_sales]
,[gross_sales]
,[cash_in_draw]
,[cards_in_draw]
,[your-date-column-here])
VALUES
(@f1
,@f2
,@f3
,@f4
,GETDATE())
The column name 1099flag does not conform to the rules for regular identifiers because it begins with a digit, so square brackets are required, e.g. [1099flag]
http://msdn.microsoft.com/en-us/library/ms175874(SQL.90).aspx
This will fix your first error but generate another: "Column 'table2.1099flag' is invalid in the select list...
INSERT INTO tblgardiff_50
([net_sales]
,[gross_sales]
,[cash_in_draw]
,[cards_in_draw])
VALUES
(@f1
,@f2
,@f3
,@f4
,@f5)
You have four columns in the INSERT column list but five variables in the VALUES clause; either add the missing column to the list or remove @f5.
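One way to make the counts match is sketched below, assuming @f5 is simply surplus; if it actually maps to a fifth column, add that column to the list instead:

```sql
-- Four columns, four values: the counts now match.
INSERT INTO tblgardiff_50
    ([net_sales]
    ,[gross_sales]
    ,[cash_in_draw]
    ,[cards_in_draw])
VALUES
    (@f1
    ,@f2
    ,@f3
    ,@f4)
```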