Thank you for being a valued part of the CNET community. As of December 1, 2020, the forums are in read-only format. In early 2021, CNET Forums will no longer be available. We are grateful for the participation and advice you have provided to one another over the years.

Thanks,

CNET Support

Question

server throughput time

Sep 7, 2018 2:46AM PDT

Given that we have, for example, a 200 MB file and a network that provides a throughput of 2.3 Kbps, how can I calculate the time needed to send the whole file? In a repetitive system (a server), if that matters/makes sense. Is there a difference between TCP and UDP, considering only the time needed to send the file, not to establish the connection?

Thank you.


Answer
Re: time
Sep 7, 2018 2:55AM PDT

That's basic math. 2.3 Kbps = 2.3/8 KB per second = 0.2875 KB per second. Now divide 200 MB (200,000 KB) by 0.2875 KB/s and you know the answer: nearly 700,000 seconds, which is about 193 hours, or roughly 8 days (there are 3,600 seconds in an hour).
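In Python, that calculation looks like this (a quick sketch; the helper name is my own, and I'm using decimal units, 1 MB = 1,000,000 bytes):

```python
def transfer_time_seconds(file_bytes, throughput_kbps):
    """Time (in seconds) to send file_bytes at throughput_kbps kilobits/s."""
    file_kilobits = file_bytes * 8 / 1000  # bytes -> bits -> kilobits
    return file_kilobits / throughput_kbps

# 200 MB file over a 2.3 Kbps link
seconds = transfer_time_seconds(200_000_000, 2.3)
print(round(seconds))         # ~695,652 seconds
print(round(seconds / 3600))  # ~193 hours
```

Using binary units (1 MB = 1024 KB) instead changes the result by a few percent, but not the order of magnitude.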

Kbps, by the way, isn't a time value, so it isn't a "throughput time" either. It's just a throughput.

If you want to know the difference between TCP and UDP, why not measure it yourself?

Re:
Sep 7, 2018 4:00AM PDT

200MB x 1024 = 204,800KB / 8 (8 bits in one byte) = 25600 Kb (kilobit) / 2.3Kbps = 11130 seconds / 60 = 185 minutes

So what is wrong with this math?

You divided by 8 rather than multiplied.
Sep 7, 2018 8:58AM PDT

To go from bytes to bits, you multiply.
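Redone with multiplication, the math from the earlier post works out like this (a quick Python sketch, keeping the same binary units, 1 MB = 1024 KB):

```python
kilobytes = 200 * 1024       # 204,800 KB
kilobits = kilobytes * 8     # 1,638,400 Kb -- multiply by 8, don't divide
seconds = kilobits / 2.3     # divide by the 2.3 Kbps throughput
hours = seconds / 3600
print(round(seconds))        # ~712,348 seconds
print(round(hours))          # ~198 hours, a bit over 8 days
```
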