php upload / download 3

kzn

Hi

Not sure if this should be in the PHP section or the Apache one, sorry if it is in the incorrect place. I have created a website for our company with PHP. Basically users can log in and create a ticket, but sometimes they need to send us files. I have created a basic upload page, but my question is: surely this will have performance issues on our web server, so should we have a separate machine for uploads/downloads? If that is the case, how do people log in, and how is the session carried over to the upload/download server? I am just so confused.

Many thanks in advance.
 
Interesting problem.
First off, how big are the expected files?
How stressed will the web server be during potential upload times?
You're correct that there will be an impact, but it might be very small.
If you wanted to invisibly send a user to a separate server to do the upload I'm sure you could do it, but you might be better off using that other machine to form a web farm with the existing server and share the overall system load.
If you already have a web farm and are looking for a separate server, I apologise for teaching you to suck eggs.
I would think you might be able to exploit AJAX to implement an upload control (check out something like jQuery).
Or you can put the URL of the "upload machine" into the form tag (which means you would have to buy a new domain name so you could route to that site; it would actually be a different PHP site, but see later about sub-domains). You might get some cross-domain attack issues here, I don't know.
So if you do implement a new machine you are correct that you will have to get your customer to authenticate. You could use PHP sessions across domains if you implement database session handlers rather than the default local file-system solution; if you have a farm you will have done this already, I would think. This means you have to register the handlers on both sites to point at the same database. You'll have to get the session ID to both sites for this to work, which will mean the cookie will have to be in scope for both sites. If your main site is called www.fred.com you could have a subdomain called upload.fred.com (which would solve having a new domain) and set the cookie scope to .fred.com so it gets sent to all domains that end in fred.com.
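A minimal sketch of what I mean by database session handlers, assuming a shared MySQL database that both machines can reach (the table, column names and credentials below are made up purely for illustration):
Code:
<?php
// Shared session storage - a sketch only. Assumes a table such as:
//   CREATE TABLE sessions (id VARCHAR(128) PRIMARY KEY, data TEXT, updated INT);
$db = new PDO('mysql:host=db.fred.com;dbname=tickets', 'user', 'secret');

function sess_open($path, $name) { return true; }
function sess_close() { return true; }

function sess_read($id) {
    global $db;
    $stmt = $db->prepare('SELECT data FROM sessions WHERE id = ?');
    $stmt->execute(array($id));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    return $row ? $row['data'] : '';
}

function sess_write($id, $data) {
    global $db;
    $stmt = $db->prepare('REPLACE INTO sessions (id, data, updated) VALUES (?, ?, ?)');
    return $stmt->execute(array($id, $data, time()));
}

function sess_destroy($id) {
    global $db;
    $stmt = $db->prepare('DELETE FROM sessions WHERE id = ?');
    return $stmt->execute(array($id));
}

function sess_gc($max) {
    global $db;
    $stmt = $db->prepare('DELETE FROM sessions WHERE updated < ?');
    return $stmt->execute(array(time() - $max));
}

// Register the handlers on BOTH sites so they point at the same database.
session_set_save_handler('sess_open', 'sess_close', 'sess_read',
                         'sess_write', 'sess_destroy', 'sess_gc');

// Make sure the session is written before objects are torn down at shutdown.
register_shutdown_function('session_write_close');

// Scope the session cookie to .fred.com so it is sent to both
// www.fred.com and upload.fred.com.
session_set_cookie_params(0, '/', '.fred.com');
session_start();
?>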
You might be able to mess about with ports on the web server, i.e. don't use the default port 80 for uploads. This might mean a fiddle about with Apache or IIS to get that to work.
Anyhow, something for you to go at!
 
Uploading files is just a form of a POST request. So it is the web server that does most of the work. Like ingresman pointed out, it is probably wiser and easier to load-balance the entire site if you need it. If you don't need it, do not make things overly difficult for yourself.

In other words: program like there will be no load problem, and let system administrators implement load-balancing if needed.
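In that spirit, handling the upload itself can stay very simple; here is a minimal sketch of a receiving script (the 'userfile' field name and target directory are only examples):
Code:
<?php
// A bare-bones receiver for a POSTed file.
// 'userfile' is whatever name the <input type="file"> element uses;
// the target directory is only an example.
if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    $target = '/var/www/uploads/' . basename($_FILES['userfile']['name']);
    if (move_uploaded_file($_FILES['userfile']['tmp_name'], $target)) {
        echo 'Upload saved.';
    } else {
        echo 'Could not save the upload.';
    }
} else {
    echo 'No file received, or the upload failed.';
}
?>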

+++ Despite being wrong in every important aspect, that is a very good analogy +++
Hex (in Darwin's Watch)
 
First of all, I would like to thank you both for your posts.

Ingresman, you are not teaching me to suck eggs, this is totally new to me and I appreciate your comments. After reading your post, I decided to research server farms and came across a document on the web; I know it's not a PHP installation, but it helped me understand what you were talking about. I think once you go down this road things become very complicated: basically, say you have three servers dishing up the site, one fails but the others carry on going. Session info needs to work across all the servers, so I would imagine it's kept in a database. With this, I still don't understand how we get uploads to use a certain server.

Your comments "You'll have to get the session ID to both sites for this to work which will mean the cookie will have to be in scope for both sites. If your main site is called you could have a subdomain called (which would solve having a new domain) and set the cookie scope to it gets sent to all domains that end in fred.com.
You might be able to mess about with ports on the web server i.e. don't use the default port 80 for uploads. This might mean a fiddle about with Apache or IIS to get that to work."

... this seems like the way I have to go. Do you or anyone else have any more info on this? I kind of have an understanding, but it's not 100%.

Thanks again for both of your comments.
 
You might be on your own but to target a particular server your <form> tag will look like:
Code:
<form method='POST' enctype='multipart/form-data' action='http://www.youruploadserver.com/youruploadpage.php'>
In effect the upload will execute the page given in the action attribute, whichever server served the form. The other attributes, i.e. method and enctype, have to have the values shown.
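To tie this back to the session discussion, a rough sketch of what youruploadpage.php might do on the upload machine, assuming the database session handlers sketched earlier are registered there too (the include file name and the 'user_id' session key are only examples):
Code:
<?php
// youruploadpage.php on the upload machine - a rough sketch.
// Assumes the same database session handlers as the main site are
// registered (here via a hypothetical shared include) before
// session_start(), and that the session cookie is scoped to .fred.com
// so the browser sends it to this subdomain as well.
require 'shared_session_handlers.php';
session_start();

// Only accept files from users who logged in on the main site.
if (!isset($_SESSION['user_id'])) {
    header('HTTP/1.0 403 Forbidden');
    exit('Please log in on the main site first.');
}

// ... then deal with $_FILES exactly as in the earlier upload sketch ...
?>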
 
First, are you prematurely optimizing or are you having a real performance problem?

If you load-balance sites, it just means that all traffic gets divided between more than one server on the server side. This is usually done in some smart fashion, so any client will stay connected to the same server until the session closes. There's no need whatsoever for a separate upload domain or subdomain. But if you do load-balancing, that will mean that the files will be uploaded to any one of the servers and therefore not be available on the others unless you make sure that the uploaded files are put on a shared (internal) network location.
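A rough illustration of that last point; the '/mnt/shared/uploads/' path below is just an example of an NFS or SAN mount that every server in the farm can see:
Code:
<?php
// On every load-balanced web server, write uploads to the same shared
// location so a file received by one server is visible to the others.
// The mount path is only an example.
define('UPLOAD_DIR', '/mnt/shared/uploads/');

if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    $target = UPLOAD_DIR . basename($_FILES['userfile']['name']);
    move_uploaded_file($_FILES['userfile']['tmp_name'], $target);
}
?>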

+++ Despite being wrong in every important aspect, that is a very good analogy +++
Hex (in Darwin's Watch)
 
This is usually done in some smart fashion, so any client will stay connected to the same server until the session closes.
Hope not; sticky sessions are unusual and would only be needed if you had some shared memory cache somewhere.
I would say most load-balancing is either round robin or based on seeing how busy a server is. I've never heard of the same server being targeted.
 
Thanks once again for the posts. DonQuichote, if we have 3 servers with load balancing and a central shared storage system, and say we have 8 people uploading huge files, how is this going to affect someone who is accessing our ticket system (which requires access to the dynamic pages only)? Surely these 8 people are going to be chewing up all available bandwidth and processing power? Please let me know your thoughts. Thanks again to both you and Ingresman.
 
this is an interesting discussion. hope it's ok if i wade in.

for database replication the classic model is a single server for non-idempotent transactions and a distributed model for idempotent queries. the central server (on which all non-idempotent transactions are transacted) then acts as the replication master. i would think this is also optimal for file system replication, although many-to-many replication solutions do exist for filesystems (whereas they are a right pain in the a** for databases).
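something like the following is what i mean, a rough sketch only, with made-up host names, credentials and table:
Code:
<?php
// A sketch of the master/replica split described above.
// Host names, credentials and the tickets table are made up for illustration.
$master  = new PDO('mysql:host=db-master.fred.com;dbname=tickets', 'user', 'secret');
$replica = new PDO('mysql:host=db-replica1.fred.com;dbname=tickets', 'user', 'secret');

// Non-idempotent work (INSERT/UPDATE/DELETE) always goes to the master...
$stmt = $master->prepare('INSERT INTO tickets (user_id, subject) VALUES (?, ?)');
$stmt->execute(array(42, 'Cannot upload my file'));

// ...while read-only queries can be spread across the replicas.
$stmt = $replica->prepare('SELECT id, subject FROM tickets WHERE user_id = ?');
$stmt->execute(array(42));
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
?>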

if you are concerned about session management between different domains then it's time to migrate sessions to a database. this is wholly trivial and plays nicely with the concept of a replication master. there is an FAQ for database session management in this forum.
 
Thanks jpadie for your comments. Looking at my last post, my concern is that uploads will use all available bandwidth and resources and leave no room for browsing and database requests. DonQuichote mentioned that "This is usually done in some smart fashion, so any client will stay connected to the same server until the session closes." So it looks like I won't have to worry about sessions; I need to work out how to control uploads/downloads versus normal browsing and database requests.

Thanks again to everyone who has contributed on here :)
 
I think all things are possible, but I can't help thinking that you are painting yourself into a corner by overcomplicating things when you don't know if you will actually have issues; premature optimisation can add complexity.
Reading your reply about 8 users chewing through available bandwidth etc., modern web servers can service many hundreds of simultaneous threads, so unless you are running HUGE queries I can't see why you would have issues.
Could be wrong.
As jpadie says, interesting discussion.
 
I would like to thank everyone who has contributed to my post. Thank you
 