Hi,
Thank you for your reply.
Here is the identical code run for both logins.
SELECT
    TransactionOrigin.Description,
    COUNT(AccountTransaction.SettlementAmount) AS 'Number of Transactions',
    SUM(AccountTransaction.SettlementAmount) AS 'Value'
FROM CustomerAccount
INNER JOIN...
Morning,
I have come across a bizarre problem in SQL Server 2005 that has me stumped.
I have created a SQL Server Login (ReportReader) that has no server level roles assigned but does have the db_datareader role on a particular database. (The purpose of this account is to eventually be used...
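For anyone wanting to reproduce the setup, a minimal sketch of such a login looks like the following (the database name and password here are placeholders, not from the original post; sp_addrolemember is the SQL Server 2005 way of adding a role member):

```sql
-- Server-level login with no server roles beyond the default 'public'.
CREATE LOGIN ReportReader WITH PASSWORD = 'StrongP@ssw0rd!';  -- placeholder password

USE ReportingDB;  -- hypothetical database name
-- Map the login to a database user and grant read-only access.
CREATE USER ReportReader FOR LOGIN ReportReader;
EXEC sp_addrolemember 'db_datareader', 'ReportReader';
```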
Hi,
I have persisted the column to the database, as the column will be relied on hundreds of times each month.
It took under a minute to build on 3 million records, so I don't think there is any need to be concerned about performance.
Many Thanks,
John
Hi AlexCuse,
Thank you for the response.
That will certainly force the resulting data type to BIT; however, does this then mean that the cast is performed for each column value, resulting in additional overhead compared to being able to just specify the data type of...
Morning,
I have a computed column on a SQL2K5 table. The data type is defaulting to int; however, I would like it to be stored as type bit.
Is there any way that I can force this to occur?
The code for the column is
case when [jobcode] like '%SR%' then (1) else (0) end
Many Thanks,
John
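For anyone landing on this thread: one way to force the bit type is to wrap the CASE expression in a CAST when defining the computed column. A sketch, assuming a table dbo.Job and a column name IsSenior of my own invention:

```sql
-- The CASE expression alone is typed as int; CAST forces the column to bit.
ALTER TABLE dbo.Job
ADD IsSenior AS CAST(CASE WHEN [jobcode] LIKE '%SR%' THEN 1 ELSE 0 END AS bit) PERSISTED;
```

PERSISTED stores the computed value on disk rather than recalculating it on every read, which matches the poster's later decision to persist the column.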
Afternoon,
I have a query regarding performing an update using an inner join. Below is an example data set. What I wish to achieve is for the code column to be updated consistently across all matching records.
ID | Code | CompanyGroup | ContactGroup
 1 |  A   |    12345     |    84566
 2 |  C   |    12345     |    84566
 3 |  A   |    12345     |    84566...
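A sketch of the UPDATE ... FROM ... INNER JOIN pattern this calls for, assuming the goal is to copy one row's Code to every row sharing the same CompanyGroup and ContactGroup (the table name Accounts and the choice of MIN(ID) as the "master" row are my assumptions, not from the post):

```sql
UPDATE a
SET a.Code = m.Code
FROM Accounts a
INNER JOIN Accounts m
    ON  m.CompanyGroup = a.CompanyGroup
    AND m.ContactGroup = a.ContactGroup
-- Take the Code from the lowest-ID row in each matching group.
WHERE m.ID = (SELECT MIN(x.ID)
              FROM Accounts x
              WHERE x.CompanyGroup = a.CompanyGroup
                AND x.ContactGroup = a.ContactGroup);
```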
Dear All,
I currently have a problem requiring that I insert approximately 2 million rows into a database, one record at a time. Now, I know that inserting a single row at a time is a very inefficient approach to this problem, given that operations should be set-based; however, I do not have an...
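If row-by-row inserts really are unavoidable, one common mitigation is to commit in batches rather than once per row, since every commit forces a transaction log flush. A sketch only, with placeholder table and column names:

```sql
SET NOCOUNT ON;

DECLARE @i int;
SET @i = 0;

BEGIN TRAN;
WHILE @i < 2000000
BEGIN
    INSERT INTO dbo.TargetTable (Col1) VALUES (@i);  -- placeholder single-row insert
    SET @i = @i + 1;

    -- Commit every 10,000 rows so each log flush covers a batch, not one row.
    IF @i % 10000 = 0
    BEGIN
        COMMIT TRAN;
        BEGIN TRAN;
    END
END
COMMIT TRAN;
```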
Hi Manmaria,
Personally, I find that the spec of the server hosting your database has a very strong influence, but the overall consensus seems to be to try to avoid using temporary tables where possible,
either by using derived tables:
http://www.sql-server-performance.com/jg_derived_tables.asp...
Excellent gmmastros!
I knew a left outer join testing for NULL was a solution, as this was the first avenue I explored, but I ended up joining every record to every other record because I did not exclude matches where the IDs were the same, as in the code you provided.
I made a few changes to...
Hi Alex,
Thank you for your reply.
The problem submitted is a very small part of a much larger project, the aim of which is basically to consolidate a variety of data sources and perform de-duplication between them.
The value pairs detailed in the problem represent what are known as duplicate...
Afternoon,
I am struggling with what I thought should be an easy problem and would appreciate some assistance.
I have a data set, an example is outlined below, that I want to contain only distinct relationships, so to speak, between two columns (source,duplicate).
source | duplicate
   1   |    2
   1   |    3...
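One sketch of how pairs like this can be deduplicated, assuming (1, 2) and (2, 1) represent the same relationship and we keep the row where source < duplicate (the table name Dupes is an assumption; the thread's accepted answer used a left outer join with a NULL test instead):

```sql
-- Delete each row whose mirror image also exists, keeping the row
-- with the smaller value in the source column.
DELETE d
FROM Dupes d
INNER JOIN Dupes r
    ON  r.source    = d.duplicate
    AND r.duplicate = d.source
WHERE d.source > d.duplicate;
```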
Hi M,
Thank you for the reply.
1. We can rule out the FTP permissions.
2. Timeout is a possibility. Where can this be set/defined?
Below is the log output as requested.
The execution of the following DTS Package succeeded:
Package Name: TPS_Download_And_Deploy
Package Description: (null)...
Morning,
I have a DTS job that appears to be failing on an FTP transfer task (see below). It is one of two DTS jobs that run overnight, both of which download files from an FTP site. The smaller DTS job/file (2MB) completes successfully; however, the larger (32MB) does not.
I suppose the two jobs...
Hi,
I was thinking along similar lines about this today. So in short you are saying it is more efficient to perform string manipulation via ActiveX controls than directly on the DB itself using T-SQL?
To provide a little more detail, the field will be generated on numerous tables, in order to...
Hi,
Thank you again for your reply.
The UDF is one of six UDFs that will need to be run once a month, so it is fair to say that on our current hardware this will take about an hour collectively per data set. In isolation this is not a problem, but when you consider that this process is one of...
Hi,
Thanks for the reply.
Having read the FAQ, would it then be fair to say that a batch (update) will only have a performance benefit if the log file is not already large enough to support the query, i.e. it needs to grow to accommodate it?
The reason I ask is that testing thus far using a batch...
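For context, the usual shape of a batched update on SQL Server 2005 loops over UPDATE TOP (N) until no rows remain, so each transaction (and therefore each log reservation) stays small. Table name, batch size, and predicate below are placeholders:

```sql
DECLARE @rows int;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    UPDATE TOP (50000) dbo.BigTable
    SET Flag = 1
    WHERE Flag <> 1;   -- predicate must exclude already-updated rows,
                       -- otherwise the loop never terminates

    SET @rows = @@ROWCOUNT;
END
```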
Hi Everyone,
I have a User Defined Function that currently takes about 6 minutes to process 100,000 records. Given that I need to process about 3 million records I am naturally keen to optimise this UDF if at all possible.
The UDF basically creates a string token based on two strings passed as...