Hi all, I just wanted to describe a network and ask whether you think it is possibly overworked...
There are 4 main switches, each with 8 ports, and all are gigabit switches....
The network has approx the following:
10 IP cameras
2 servers: one for the cameras, one for file sharing and printer management
2 card machines
5/6 phones connected to wifi for simple web browsing
I have heard people say that IP cameras are very bandwidth intensive, and I was just wondering if you think I am putting too much on the network.
The switches, computers, and servers all have gigabit NICs.
I have attached a picture of the CCTV server, which runs the cameras 24/7. The NIC shows about 2.5% usage of the gigabit card. So surely, if that server can handle a constant stream of packets from 10 cameras while only sitting at around 2.5% network utilization, everything should be OK? Is this correct, or am I looking at it all wrong?
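For what it's worth, here is the back-of-the-envelope arithmetic behind that 2.5% figure, as a rough sanity check. It assumes the camera streams are essentially the only significant traffic on that server's NIC (the link speed and camera count are from the post; everything else is an estimate):

```python
# Rough sanity check: what does ~2.5% of a gigabit NIC imply per camera?
# Assumes the CCTV server's NIC traffic is almost entirely camera streams.

LINK_SPEED_MBPS = 1000   # gigabit NIC
UTILIZATION = 0.025      # ~2.5%, as reported by the server
NUM_CAMERAS = 10

total_mbps = LINK_SPEED_MBPS * UTILIZATION    # aggregate camera traffic
per_camera_mbps = total_mbps / NUM_CAMERAS    # average per stream

print(f"Total camera traffic: ~{total_mbps:.0f} Mbps")
print(f"Per camera: ~{per_camera_mbps:.1f} Mbps")
```

That works out to roughly 25 Mbps in total, or about 2.5 Mbps per camera, which is in the normal range for compressed (e.g. H.264) IP camera streams and nowhere near saturating a gigabit link.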
See attached image.
Edited by David Ashcroft, 20 December 2013 - 05:39 PM.