Yeah, that's some pretty good advice there.
I discussed it with my MD today, on the assumption that UK terms were similar to US, and he wanted to steer clear of any grey 'joint ownership' area, to the point that he forbade any in/out-of-hours project. He's liable to knee-jerk like that...
I did...
From a general industry point of view, who owns the code we write while we are employed by a company? Say we write code both in and out of work time? Is there any agreement on this?
There's a couple of neat things I want to work on. Basically it'll be of use to the company, but I want to be free to...
Yes, but the problem I was finding was in joining the orders table to itself on the date, which made a lot of work because it had to extract a date from the timestamp (twice).
It seems that the problem isn't solved if I'm using timestamp and still have to convert in order to create the join. Or...
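To show what I mean, the join ends up shaped something like this (table and column names here are just for illustration), with from_unixtime having to run on both sides of the comparison for every row:

-- the conversion happens twice per comparison, and the index on uts can't be used for the join
SELECT t1.orderID, t2.orderID
FROM ordersTable t1
INNER JOIN ordersTable t2
  ON from_unixtime(t1.uts, '%Y-%m-%d') = from_unixtime(t2.uts, '%Y-%m-%d')
WHERE t1.uts > unix_timestamp('2006-02-01');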
Yes I think I've got my head around it now.
I think the main reason it was dying was that I was doing from_unixtime for the timestamp on every row. I added a `date` field on which to join and it worked fine.
From what I've read it seems that, in 4.1, there is not a way to set a `date` field...
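For the record, this is roughly what I did (a sketch only, with the column name assumed; in 4.1 I'm keeping the new column populated from the application rather than relying on a default):

-- add a plain DATE column, backfill it from the unix timestamp, and index it
ALTER TABLE ordersTable ADD COLUMN orderDate DATE;
UPDATE ordersTable SET orderDate = from_unixtime(uts, '%Y-%m-%d');
ALTER TABLE ordersTable ADD INDEX idx_orderDate (orderDate);

-- the self-join can then compare the indexed column directly
SELECT t1.orderID, t2.orderID
FROM ordersTable t1
INNER JOIN ordersTable t2 ON t1.orderDate = t2.orderDate
WHERE t1.uts > unix_timestamp('2006-02-01');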
I have indexes on orderID, and a unix timestamp field (I don't have a 'date' field).
So I used t1.uts>unix_timestamp('2006-02-01') in the where statement.
I'm not sure why you'd include the orderTotal in the join condition. Maybe I've explained the problem/definition badly?
doesn't work. The group by day just limits the results to 1 entry per day, and shows details of the first order from the day. I can get totals for the day, but what I really want is the top 10 orders.
for example:
SELECT from_unixtime(uts,'%u') as week_number, count(*) as orders from ordersTable WHERE from_unixtime(uts,'%Y')=2006
GROUP BY from_unixtime(uts,'%u')
Giving
+-------------+--------+
| week_number | orders |
+-------------+--------+
|          18 |     20 |...
I feel I should be able to work this out, but I'm drawing a blank.
How would I retrieve, for example, the top 10 orders for each day for a period, where I order by orderTotal?
I can't seem to find any way to limit the results for each day, rather than the results as a whole.
Any ideas?
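The only thing I've come up with so far is a correlated subquery that counts how many orders on the same day have a bigger total, along these lines (untested, column names assumed, and probably not quick):

-- keep a row if fewer than 10 orders on the same day have a larger total
SELECT o.*
FROM ordersTable o
WHERE o.uts >= unix_timestamp('2006-02-01')
  AND ( SELECT COUNT(*)
        FROM ordersTable o2
        WHERE from_unixtime(o2.uts, '%Y-%m-%d') = from_unixtime(o.uts, '%Y-%m-%d')
          AND o2.orderTotal > o.orderTotal ) < 10
ORDER BY from_unixtime(o.uts, '%Y-%m-%d'), o.orderTotal DESC;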
I back up my DB on Linux using a bash script. It was a pain in the rear as I wanted to back up each table into a separate file, so I've got 30 different statements running. Maybe there was a better way, but I'm not that clever.
Here's the script. If you are using Windows then you'd need to change...
I can't seem to find an answer in the literature, but is it possible in 4.1 to place a lock on a particular record? Say, to prevent changes to the record while it's being viewed.
There's plenty on table locks but no record locks. Are these not supported?
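The closest things I've come across so far are InnoDB's SELECT ... FOR UPDATE inside a transaction, and advisory locks via GET_LOCK(). A rough sketch (table name and ID are just for illustration):

-- InnoDB only: the row stays locked until COMMIT/ROLLBACK, so other sessions
-- that also select it FOR UPDATE will wait
START TRANSACTION;
SELECT * FROM ordersTable WHERE orderID = 123 FOR UPDATE;
-- ... view/update the record ...
COMMIT;

-- or an advisory lock held by the connection (any storage engine), which only
-- works if every client checks the same lock name
SELECT GET_LOCK('order_123', 10);
-- ... do the work ...
SELECT RELEASE_LOCK('order_123');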
Thanks for that. I agree it was a great answer.
Had some initial trouble with the former approach in that the subquery returned more than 1 row. Turned out I had some duplicate rows. A group by sorted that out.
When I ran each of them the first was much, much faster, taking a fraction of a...
Recently I've noticed the MySQL server going very slow at times. At such times I've run SHOW FULL PROCESSLIST and there have been one or two processes with a status of Sleep and a time of anything up to about 40 seconds.
What are these processes? Are they read/write processes waiting for their...
I have a table of customer problems, and another table of communications relating to the problem.
I'm trying to pull out, for each problem: details of the problem, the most recent conversation, and the timestamp of the oldest conversation.
I've got the latest conversation like so:
select *...
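The real query is longer, but in outline it's the groupwise-max pattern, something like this (table and column names assumed):

-- x finds the newest and oldest communication timestamps per problem;
-- joining communications back on the newest one pulls in its details
SELECT p.*, c.*, x.firstContact
FROM problems p
INNER JOIN ( SELECT problemID,
                    MAX(uts) AS lastContact,
                    MIN(uts) AS firstContact
             FROM communications
             GROUP BY problemID ) AS x ON x.problemID = p.problemID
INNER JOIN communications c
        ON c.problemID = p.problemID AND c.uts = x.lastContact;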
I have this query, which just takes too long and kills the server.
I have 2 left joins and want to do an inner join matching values in either of the left joins. It seems to be the use of an OR in this inner join which is the bottleneck.
select * from
categories c
INNER JOIN cat_subCat csc...
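One rewrite I'm considering splits the OR into a UNION so each branch can use its own index. In skeleton form (placeholder table and column names, not my real schema):

-- the troublesome join is roughly: INNER JOIN x ON x.ref = a.val OR x.ref = b.val
SELECT t.*, x.*
FROM t
LEFT JOIN a ON a.t_id = t.id
LEFT JOIN b ON b.t_id = t.id
INNER JOIN x ON x.ref = a.val
UNION
-- same query again, but matching on the other left-joined table;
-- UNION removes duplicate rows (UNION ALL would keep them)
SELECT t.*, x.*
FROM t
LEFT JOIN a ON a.t_id = t.id
LEFT JOIN b ON b.t_id = t.id
INNER JOIN x ON x.ref = b.val;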