Hi,
I am new to the forum but hope you can help me with some general network diagnostics.
I work for a support company that provides software; I'd rather not go into too much detail about the industry or what the software does.
The software runs on an ISAM database. Databases range from 500 MB up to 15 GB at the larger end, and generally contain thousands of records.
As you operate the software, many records from many different tables are loaded and cross-referenced; they work together in harmony.
Generally there is a host PC, which in most cases is dedicated.
Most businesses have between 5 and 8 PCs, but some have 20.
We rely on external IT companies to repair any faults we identify, and the businesses use them for general IT support and advice.
I am finding it very difficult to trust many of these IT companies' opinions on whether the network is actually as good as it looks.
Due to my limited experience with network diagnostics and testing, there is only so much I can do: I generally run continuous pings looking for timeouts, and transfer large files, expecting around 80% of the rated network throughput.
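For reference, the ping part of my testing is essentially the following sketch. This is just a minimal illustration of the idea, not a proper tool: the flags shown are the Linux ones (Windows ping uses `-n 1 -w 1000` instead), and the host address would be whatever your host PC actually is.

```python
import subprocess

def count_ping_timeouts(host, count=10):
    """Ping `host` `count` times and return how many pings failed.

    Uses the system `ping` binary with Linux-style flags:
    -c 1 = send one echo request, -W 1 = one-second timeout.
    (On Windows the equivalent flags are -n 1 -w 1000.)
    """
    timeouts = 0
    for _ in range(count):
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "1", host],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        if result.returncode != 0:  # non-zero exit = no reply in time
            timeouts += 1
    return timeouts
```

You would call something like `count_ping_timeouts("192.168.1.10", 100)` against the host PC (address is a placeholder) and expect zero timeouts on a healthy LAN.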
Here is my dilemma: our software is not accessing large files, it's accessing many small records.
Many businesses use our software with absolutely no problems whatsoever, while on the flip side others see 10 seconds of loading between clicks, or even outright crashes.
I'm finding it difficult to establish any differences between them.
In the cases where a business has slow loading, it is always fine on the server itself (accessing local files).
In the cases where a workstation has slow loading, if the server role is temporarily moved to that workstation, it becomes really fast, again because it's accessing local files. To me this confirms the machine's specification is fine, and once again points to the network.
Now that you understand my scenario and setup, I can ask my questions. Sorry it is a drawn-out post.
Is there more to testing a network? What about hubs/switches?
How can i test that the network can handle lots of small data requests?
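To give an idea of the kind of test I mean: the sketch below (plain Python sockets, everything hypothetical) times thousands of tiny request/response round trips instead of one big file copy, since it's per-request latency, not bulk throughput, that a record-at-a-time application feels. The loopback echo server here is only a stand-in; in a real test you would run a small echo service on the host PC and point the client at it from a workstation.

```python
import socket
import statistics
import threading
import time

def start_echo_server():
    """Start a tiny TCP echo server on loopback (a stand-in for a
    service on the real host PC). Returns the port it listens on."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))  # port 0 = let the OS pick a free port
    srv.listen(5)

    def serve():
        while True:
            conn, _ = srv.accept()
            with conn:
                while True:
                    data = conn.recv(4096)
                    if not data:
                        break
                    conn.sendall(data)  # echo the request straight back

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()[1]

def measure_small_requests(host, port, n=1000, size=100):
    """Send `n` requests of `size` bytes each over one connection and
    time every round trip. Returns per-request latencies in ms."""
    cli = socket.create_connection((host, port))
    # Disable Nagle's algorithm so each tiny request is sent immediately,
    # like an interactive database client rather than a bulk file copy.
    cli.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
    payload = b"x" * size
    latencies = []
    for _ in range(n):
        start = time.perf_counter()
        cli.sendall(payload)
        received = 0
        while received < size:  # read until the full echo is back
            received += len(cli.recv(4096))
        latencies.append((time.perf_counter() - start) * 1000.0)
    cli.close()
    return latencies

port = start_echo_server()
lats = measure_small_requests("127.0.0.1", port)
print(f"median {statistics.median(lats):.3f} ms, "
      f"max {max(lats):.3f} ms over {len(lats)} requests")
```

The interesting number is the worst-case latency, not the median: occasional spikes into tens or hundreds of milliseconds are exactly what a large-file transfer averages away but a click-by-click application exposes.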
Is it possible that when transferring files over Windows networking there is some fault tolerance/correction? What if my software is not as resilient as Windows, and crashes if a few packets are missing or corrupt? Could it be that when Windows receives corrupt data it can re-request it, whereas my software cannot detect and recover from that kind of fault?
Any help would be appreciated
Regards
Edited by Html33, 23 March 2018 - 06:06 PM.