
Network Diagnostics & Testing

2 replies to this topic

#1 Html33


  • Members
  • 1 posts
  • Local time:07:27 PM

Posted 23 March 2018 - 06:03 PM



I am new to the forum but hope you can help me with some general network diagnostics.

I work for a support company that provides software; I don't wish to go into too much detail about the industry or what the software does.


The software runs on an ISAM database. The databases range from 500 MB up to 15 GB at the larger end, and a database generally contains thousands of records.

As you operate the software, many records from many different tables are loaded and cross-referenced; they work together in harmony.

Generally we have a host PC, which in most cases is dedicated.


Most businesses have between 5 and 8 PCs, but some have 20.


We rely on external IT companies to repair any faults that we might identify, and the businesses use them for general IT support/advice.

I am finding it very difficult to trust a lot of the IT companies' opinions on whether the network is as good as it looks.


Due to my lack of experience with network diagnostics and testing, I am limited in what I can do. Generally I will run constant pings looking for timeouts, and transfer large files while looking for around 80% network throughput.

Here is my dilemma: our software is not accessing large files, it's accessing many small records.


I have many businesses that use our software and have absolutely no problems whatsoever, and then on the flip side I have others that see 10 seconds of loading between clicks, or may even crash altogether.

I'm finding it difficult to establish any differences between them.


In the cases where businesses are having slow loading, it is always fine on the server itself (accessing local files).

In the cases where businesses are having slow loading on the workstations, if the server is temporarily moved to that workstation, the workstation is really fast, again because it's accessing local files. To me that confirms the specification of the machine is fine and once again points to the network.



Now that you understand my scenario and setup, I can ask my questions :) Sorry for the drawn-out post.



Is there more to testing a network? What about hubs and switches?

How can I test that the network can handle lots of small data requests?
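One rough way to answer that question directly is to time many small request/response round trips instead of one big transfer, since that is what the workload above actually looks like. Below is a minimal sketch in Python; the local echo server is a hypothetical stand-in for a service running on the host PC, and the payload size and counts are arbitrary assumptions. On a healthy LAN the mean round trip is typically a fraction of a millisecond; a large p95 or max hints at the kind of intermittent stalls described above that a bulk-throughput test never shows.

```python
import socket
import statistics
import threading
import time

def echo_server(sock):
    # Minimal local echo server used as a stand-in target;
    # in practice you would point the probe at the real host PC.
    conn, _ = sock.accept()
    with conn:
        while data := conn.recv(64):
            conn.sendall(data)

def probe_small_requests(host, port, n=200, payload=b"x" * 64):
    # Time n tiny request/response round trips over one TCP connection.
    # Latency on small records, not bulk throughput, is what the
    # application workload described above actually stresses.
    times = []
    with socket.create_connection((host, port), timeout=5) as s:
        # Disable Nagle's algorithm so small sends go out immediately.
        s.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
        for _ in range(n):
            t0 = time.perf_counter()
            s.sendall(payload)
            received = 0
            while received < len(payload):
                received += len(s.recv(64))
            times.append((time.perf_counter() - t0) * 1000.0)
    return {
        "mean_ms": statistics.fmean(times),
        "p95_ms": statistics.quantiles(times, n=20)[-1],
        "max_ms": max(times),
    }

if __name__ == "__main__":
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    threading.Thread(target=echo_server, args=(srv,), daemon=True).start()
    print(probe_small_requests("127.0.0.1", srv.getsockname()[1]))
```

Run against a workstation-to-server path during business hours and again off-hours; a gap between the two is more telling than either number alone.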


Is it possible that when transferring files over Windows networks there is fault tolerance/correction? What if my software is not as robust as Windows, and it crashes if a few packets are missing or corrupt? Could it be that if Windows receives corrupt data it can re-request it, whereas my software cannot?


Any help would be appreciated


Edited by Html33, 23 March 2018 - 06:06 PM.



#2 Orecomm


  • Members
  • 266 posts
  • Gender:Male
  • Location:Roseburg, Oregon
  • Local time:11:27 AM

Posted 24 March 2018 - 10:51 PM

Your traffic should be using TCP, which handles packet loss and corruption and guarantees that packets are delivered in proper order. UDP does not have these guarantees. Your DB should have transaction guarantees as well. That said, it takes time to recover from lost or corrupted packets, so even if things look just fine the link may not be so great.

Usually the place you are going to find your clues in this case is not in most network diagnostics (unless you own and have admin access to devices in the network itself) but in the database stats, in terms of hung or aborted transactions or sometimes more detailed errors. Unfortunately, there is no standard here, so what to look for varies greatly. It's not my specialty either, but I've worked the network side for more than a few years and can tell you that a good DB analyst/tuner is really the one that can take the network to task - he's the one that can get the smoking gun by site and time of day.

I can talk about using RMON, SNMP, and Wireshark traces to find the problem, but unless you have deep access into the network these aren't going to do you much good. Your DBA can point out that there IS a problem, and to some degree where; then it's up to the network guys to track it down and stomp on it.
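One way to see that recovery cost without admin access to the switches is the host's own TCP counters. A minimal sketch, assuming a Linux machine (the counters live in `/proc/net/snmp`, which does not exist on Windows) and noting that the numbers are system-wide, not per-application: a `RetransSegs` value climbing quickly relative to `OutSegs` while the application is slow is exactly the "TCP is recovering, and it takes time" signal described above.

```python
import time

def tcp_counters(snmp_text):
    # /proc/net/snmp holds paired header/value lines per protocol;
    # match up the "Tcp:" header line with its value line.
    lines = [l for l in snmp_text.splitlines() if l.startswith("Tcp:")]
    names = lines[0].split()[1:]
    values = [int(v) for v in lines[1].split()[1:]]
    return dict(zip(names, values))

def watch_retransmissions(interval=5.0, samples=3):
    # Print the retransmission counter a few times; compare readings
    # taken while the application is fast vs. while it is stalling.
    for _ in range(samples):
        with open("/proc/net/snmp") as f:
            c = tcp_counters(f.read())
        print("RetransSegs:", c["RetransSegs"], " OutSegs:", c["OutSegs"])
        time.sleep(interval)
```

On Windows, `netstat -s` reports an equivalent "Segments Retransmitted" figure that can be eyeballed the same way.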

#3 Sneakycyber


    Network Engineer

  • BC Advisor
  • 6,136 posts
  • Gender:Male
  • Location:Ohio
  • Local time:01:27 PM

Posted 25 March 2018 - 08:42 PM

Agreed with the above. If your application uses SQL databases (it probably does), then 90% of the transaction speed depends on the server and the database, both of which your DBA should be able to evaluate. I can also say I have gone round and round with DBAs who said it was a network issue, and every time it was an issue with the setup and not the network. If you are not dropping database connections or otherwise experiencing packet loss (dropped packets), it's likely not the network, especially with 20 or fewer users.

Chad Mockensturm 
Network Engineer
Certified CompTIA Network+, A+
