Re: Question on Socket Timeouts
On 18Nov2012 03:27, Abhijeet Mahagaonkar <email@example.com> wrote:
| I'm new to network programming.
| I have a question.
| Can we set a timeout to limit how long a particular socket can read or
On the socket itself? Probably not. But...
| I have used a settimeout() function.
| The settimeout() works fine as long as the client doesn't send any data for
| x seconds.
| After accept()ing a connect() from a client, I check whether the data
| the server receives is invalid.
| I'm trying to ensure that a client sending invalid data constantly cannot
| hold the server. So is there a way of saying I want the client to use this
| socket for x seconds before I close it, no matter what data I receive?
Note the time you set up the socket, or when you accept the client's
connection. Thereafter, every time you get some data, look at the clock.
If enough time has elapsed, close the socket yourself.
So, not via an interface to the socket but as logic in your own code.
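As a minimal sketch of that logic (the limits and the handle_client name are my own, not from the thread): combine settimeout() for the idle case with a wall-clock check against the accept time for the total-lifetime case.

```python
import socket
import time

# Hypothetical limits -- tune to taste.
MAX_CONN_SECONDS = 2.0   # total lifetime allowed per client connection
IDLE_TIMEOUT = 1.0       # settimeout() already covers the "silent client" case

def handle_client(conn):
    """Serve one accepted connection, closing it after MAX_CONN_SECONDS total."""
    started = time.monotonic()      # note the time we accepted the connection
    conn.settimeout(IDLE_TIMEOUT)   # limits how long a single recv() may block
    try:
        # Every time we get some data, look at the clock: once the total
        # budget is spent we fall out of the loop no matter what the
        # client is sending, or how fast.
        while time.monotonic() - started < MAX_CONN_SECONDS:
            try:
                data = conn.recv(4096)
            except socket.timeout:
                break               # client went quiet for too long
            if not data:
                break               # client closed its end
            # ... validate the data here; break early on garbage if you like ...
    finally:
        conn.close()                # enforced regardless of what was received
```

In the server's accept() loop you would call handle_client(conn) for each accepted connection, typically in its own thread so a slow client cannot hold up accept() itself.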
Cameron Simpson <firstname.lastname@example.org>
Their are thre mistakes in this sentence.
- Rob Ray DoD#33333 <email@example.com>