Search results for query: *

  1. jrsansom

    Identical Logins producing different results

    Hi, Thank you for your reply. Here is the identical code run for both logins. SELECT TransactionOrigin.Description , COUNT(AccountTransaction.SettlementAmount) as 'Number of Transactions', SUM(AccountTransaction.SettlementAmount) as 'Value' FROM CustomerAccount INNER JOIN...
  2. jrsansom

    Identical Logins producing different results

    Morning, I have come across a bizarre problem in SQL Server 2005 that has me stumped. I have created a SQL Server login (ReportReader) that has no server-level roles assigned but does have the db_datareader role on a particular database. (The purpose of this account is to eventually be used...
  3. jrsansom

    Force computed column data type

    Hi, I have persisted the column to the database as the column will be counted on hundreds of times each month. It took under a minute to build on 3 million records, so no need to be concerned about performance, I think. Many Thanks, John
  4. jrsansom

    Force computed column data type

    Hi AlexCuse, Thank you for the response. That will certainly force the resulting data type to be BIT; however, does this then mean that the CAST calculation is performed for each column value, thereby resulting in additional overhead compared to being able to just specify the data type of...
  5. jrsansom

    Force computed column data type

    Morning, I have a computed column on a SQL2K5 table. The data type is defaulting to int; however, I would like it to be stored as type bit. Is there any way that I can force this to occur? The code for the column is: case when [jobcode] like '%SR%' then (1) else (0) end Many Thanks, John
  6. jrsansom

    Inner join update query - understanding issue

    Well that is not entirely true because AC and CA are the same two codes but different strings :-)
  7. jrsansom

    Inner join update query - understanding issue

    Hi, Thanks for your reply. The function takes two strings containing codes and returns the unique coded string. So A and C become AC, A and A become A, etc.
  8. jrsansom

    Inner join update query - understanding issue

    Afternoon, I have a query regarding performing an update using an inner join. Below is an example data set. What I wish to achieve is for the Code column to be updated consistently across all matching records.

    ID | Code | CompanyGroup | Contact Group
    1  | A    | 12345        | 84566
    2  | C    | 12345        | 84566
    3  | A    | 12345        | 84566
    ...
  9. jrsansom

    Faster insert into a table with or without a clustered index

    Dear All, I currently have a problem requiring that I insert approximately 2 million rows into a database, 1 record at a time. Now I know that inserting a single row at a time is a very inefficient approach to this problem, given that operations should be set-based; however, I do not have an...
  10. jrsansom

    Using Temp tables in calculations

    Hi Manmaria, Personally I find that the spec of the server hosting your database has a very strong influence, but the overall consensus seems to be to try to avoid using temporary tables where possible, either by using derived tables: http://www.sql-server-performance.com/jg_derived_tables.asp...
  11. jrsansom

    Removing Cross duplicate from a data set

    Excellent gmmastros! I knew a left outer join and testing for null was a solution, as this was the first avenue I explored, but I ended up joining every record to every other record because I did not exclude matches where the IDs were the same, as in the code you provided. I made a few changes to...
  12. jrsansom

    Removing Cross duplicate from a data set

    Hi Alex, Thank you for your reply. The problem submitted is a very small part of a much larger project, the aim of which is to basically consolidate a variety of data sources and perform de-duplication between them. The value pairs detailed in the problem represent what are known as duplicate...
  13. jrsansom

    Removing Cross duplicate from a data set

    Afternoon, I am struggling with what I thought should be an easy problem and would appreciate some assistance. I have a data set, an example is outlined below, that I want to contain only distinct relationships, so to speak, between two columns (source, duplicate).

    source | duplicate
    1      | 2
    1      | 3
    ...
  14. jrsansom

    FTP Task not completing during DTS job

    Hi M, Thank you for the reply. 1. We can rule out the FTP permissions. 2. Timeout is a possibility. Where can this be set/defined? Below is the log output as requested.

    The execution of the following DTS Package succeeded:
    Package Name: TPS_Download_And_Deploy
    Package Description: (null)
    ...
  15. jrsansom

    FTP Task not completing during DTS job

    Morning, I have a DTS job that appears to be failing on an FTP transfer task (see below). It is one of two DTS jobs that run overnight, both of which download files from an FTP site. The smaller DTS job/file (2MB) completes successfully; however, the larger (32MB) does not. I suppose the two jobs...
  16. jrsansom

    Optimising a User Defined Function

    Hi, I was thinking along similar lines about this today. So in short you are saying it is more efficient to perform string manipulation via ActiveX controls than directly on the DB itself using T-SQL? To provide a little more detail, the field will be generated on numerous tables, in order to...
  17. jrsansom

    Optimising a User Defined Function

    Hi, Thank you again for your reply. The UDF is one of 6 UDFs that will need to be run once a month. So it is fair to say that on our current hardware setup this will take about an hour collectively per data set. Isolated, this is not a problem, but when you consider that this process is one of...
  18. jrsansom

    Optimising a User Defined Function

    Hi, Thanks for the reply. Having read the FAQ, would it then be fair to say that a batch (update) will only have a performance benefit if the log file is not already large enough to support the query, i.e. it needs to grow to accommodate it? The reason I ask is that testing thus far using a batch...
  19. jrsansom

    Optimising a User Defined Function

    Hi Everyone, I have a User Defined Function that currently takes about 6 minutes to process 100,000 records. Given that I need to process about 3 million records I am naturally keen to optimise this UDF if at all possible. The UDF basically creates a string token based on two strings passed as...
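
Several of the threads above lend themselves to short T-SQL sketches. The sketches below are illustrative only; any database, table, column, or function names that do not appear in the excerpts are assumptions, not the posters' actual objects.

For the "Identical Logins producing different results" thread, a minimal sketch of the setup the post describes, assuming a database named SalesDB and a placeholder password:

    CREATE LOGIN ReportReader WITH PASSWORD = 'Example!Password1';  -- placeholder password
    GO
    USE SalesDB;  -- assumed database name
    GO
    CREATE USER ReportReader FOR LOGIN ReportReader;
    EXEC sp_addrolemember 'db_datareader', 'ReportReader';

When two logins run identical queries but return different results, one common cause in SQL Server 2005 is unqualified table names resolving against different default schemas; running SELECT SCHEMA_NAME(); under each login shows which schema each one defaults to.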
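
For the "Force computed column data type" thread, a minimal sketch of the CAST approach discussed, assuming the column lives on a table named Employee and is called IsSenior (both assumptions; only the jobcode expression comes from the post):

    ALTER TABLE Employee ADD IsSenior AS
        CAST(CASE WHEN [jobcode] LIKE '%SR%' THEN 1 ELSE 0 END AS bit) PERSISTED;

The CAST fixes the column's data type as bit, and PERSISTED stores the computed value with the row so the expression is evaluated when data is written rather than every time the column is read.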
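
For the "Inner join update query - understanding issue" thread, a minimal sketch of the UPDATE ... FROM ... INNER JOIN pattern under discussion, assuming a table named Contacts with ID, Code, CompanyGroup and ContactGroup columns (the table name and exact column spellings are assumptions); MAX(Code) is only a placeholder where the thread's code-combining function would sit:

    UPDATE c
    SET    c.Code = x.MergedCode
    FROM   Contacts AS c
    INNER JOIN (SELECT CompanyGroup, ContactGroup,
                       MAX(Code) AS MergedCode   -- placeholder for the real code-merging logic
                FROM   Contacts
                GROUP BY CompanyGroup, ContactGroup) AS x
            ON  x.CompanyGroup = c.CompanyGroup
            AND x.ContactGroup = c.ContactGroup;

Because the derived table produces one row per CompanyGroup/ContactGroup pair, every matching record receives the same Code value.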
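
For the "Faster insert into a table with or without a clustered index" thread, a minimal sketch of one commonly suggested pattern: load into a heap first, then build the clustered index once at the end. The table and index names are made up for illustration:

    -- created without a clustered index, i.e. as a heap
    CREATE TABLE dbo.LoadTarget
    (
        Id      int          NOT NULL,
        Payload varchar(100) NOT NULL
    );

    -- ... the 2 million single-row inserts run here ...

    -- build the clustered index once, after the load has finished
    CREATE CLUSTERED INDEX IX_LoadTarget_Id ON dbo.LoadTarget (Id);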
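
For the "Using Temp tables in calculations" thread, a minimal sketch contrasting a temp table with the derived-table rewrite mentioned in the reply; Customers and Orders are hypothetical tables:

    -- temp-table version: the intermediate result is written to tempdb first
    SELECT CustomerId, COUNT(*) AS OrderCount
    INTO   #OrderCounts
    FROM   dbo.Orders
    GROUP BY CustomerId;

    SELECT c.CustomerId, oc.OrderCount
    FROM   dbo.Customers AS c
    INNER JOIN #OrderCounts AS oc ON oc.CustomerId = c.CustomerId;

    -- equivalent derived-table version: no separate write to tempdb
    SELECT c.CustomerId, oc.OrderCount
    FROM   dbo.Customers AS c
    INNER JOIN (SELECT CustomerId, COUNT(*) AS OrderCount
                FROM   dbo.Orders
                GROUP BY CustomerId) AS oc
            ON oc.CustomerId = c.CustomerId;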
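
For the "Removing Cross duplicate from a data set" thread, a minimal sketch of one way to drop reciprocal (source, duplicate) pairs while keeping a single row per pair; DuplicatePairs is a hypothetical table name, and this is not the query gmmastros posted:

    DELETE p
    FROM   DuplicatePairs AS p
    INNER JOIN DuplicatePairs AS r
            ON  r.source    = p.duplicate   -- the reversed pair exists...
            AND r.duplicate = p.source
    WHERE  p.source > p.duplicate;          -- ...so keep only the (lower, higher) row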
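
For the "Optimising a User Defined Function" thread, a minimal sketch of the batched-update pattern being discussed, which keeps each transaction (and the log space it needs) bounded; dbo.Records, its columns, and dbo.MakeToken are hypothetical stand-ins for the real table and UDF:

    DECLARE @rows int;
    SET @rows = 1;

    WHILE @rows > 0
    BEGIN
        -- update a bounded batch per pass instead of all 3 million rows at once
        UPDATE TOP (50000) r
        SET    r.Token = dbo.MakeToken(r.String1, r.String2)
        FROM   dbo.Records AS r
        WHERE  r.Token IS NULL;   -- only rows that have not been tokenised yet

        SET @rows = @@ROWCOUNT;
    END

Each pass commits on its own, so the transaction log only has to accommodate one batch at a time rather than the whole 3 million-row update.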
