I did some searching in this forum and I didn't find exactly what I was looking for, so I apologize if this has been answered previously.
At work we will have two servers to be used for a web based application named Test Director. The first server will run Test Director, and the second server will run SQL Server 2000. Both hosts are Windows 2000 Advanced Server and contain dual gigabit Ethernet adapters. I was curious if the following configuration could be used:
Server A will run Test Director and serve HTTP on port 80.
NIC #1 is connected to the corporate network at 100Mbit with a unique IP address.
NIC #2 is connected to Server B by direct connection to utilize 1000Mbit.
Server B will run SQL Server 2000.
NIC #1 is connected to the house network at 100Mbit with a unique IP address.
NIC #2 is connected to Server A by direct connection to utilize 1000Mbit.
(None of these hosts are visible on the net outside of the company. This is an internal project)
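In case it helps picture it, a direct NIC-to-NIC link like this usually just needs its own private subnet (no gateway) and a name on Server A that resolves to Server B's private address, so the SQL traffic takes the gigabit path. A rough sketch of what I had in mind, with made-up addresses (192.168.100.x) and a made-up alias (sqlbox):

```
# Server A (Test Director), NIC #2:
#   static IP 192.168.100.1, mask 255.255.255.0, no default gateway
# Server B (SQL Server 2000), NIC #2:
#   static IP 192.168.100.2, mask 255.255.255.0, no default gateway

# In C:\WINNT\system32\drivers\etc\hosts on Server A,
# point an alias at Server B's private address:
192.168.100.2    sqlbox

# Test Director's database connection would then be configured to
# use "sqlbox", so queries ride the 1000Mbit link instead of the
# 100Mbit house network.
```

Leaving the default gateway off the crossover NICs should keep normal traffic on the house network while only Server-A-to-Server-B SQL traffic uses the direct link, but I'd welcome corrections if that's wrong.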
Does the above configuration sound feasible and logical for optimal performance given the hardware? Users will point their web browsers to Server A to use the application. Server B will primarily be a dedicated SQL Server machine for Server A, but occasional connections will be made to Server B from the house network for SQL Server administration (backups, etc.).
For the record, this isn’t my project at work, and I’m no expert with SQL Server. I recommended using the gigabit NICs as a direct connection for better performance, since the corporate network only supports 10/100Mbit. However, if this configuration isn’t feasible, then both hosts will disable the second NIC and run at 100Mbit on the house network.
I was hoping someone here has worked in a similar environment; if you have any advice, I’d gratefully accept your comments.