

Is It Time to Upgrade IT?

Today, a trip to your local computer store to buy a new PC can be an eye-opening experience. Notebooks, for example, come in many shapes and sizes: desktop replacement, notebook, sub-notebook, and netbook. The relative processing power of PCs also varies greatly. This may not matter to users who just want to use their PC for basic office applications and web browsing. However, power users who never want to wait for their computers still care very much about processing power.

When PCs were first introduced, it was far easier to understand the relative processing power of workstations. Each new PC model that was released was a quantum leap beyond its predecessors. If you were around back then, you may remember XTs, ATs, 386s, 486s and 586s. Thanks to those classifications and the relative clock speeds, you didn’t need to be a rocket scientist to determine the approximate speed of one of these PCs.

You knew when to buy a new one.

In later years, other speed-related issues became increasingly important: memory, hard drive, front-side bus, hard drive interface, PCI Express slots, USB version, hyper-threading, multitasking, operating systems, et cetera. Somewhere along the way, the ability to easily differentiate the relative power of various PCs became blurred. Today, even technology experts have to scrutinize specific benchmarks to be sure of exactly what processing power they are getting. The difficulty lies in understanding your primary applications’ infrastructure needs, knowledge of potential bottlenecks and the vast array of available choices that can satisfy your business requirements.

Best Practice
Many firms in the financial industry regularly replace their equipment after two or three years of use. This strategy has as much to do with leasing and depreciation as it does with proactive maintenance and a commitment to technology standards. It is considered a best practice to replace equipment that is older than three years. This practice provides an opportunity to implement more efficient technology, limit future maintenance costs, and reduce the risk of catastrophic system failures. Though we occasionally see firms stretch equipment into a fourth or fifth year, we don’t recommend it.

Our advice is to establish a regular routine for replacing equipment, with priority on shared resources. For instance, a firm might replace all servers every two years and workstations every three years. As game-changing technology emerges, we also make additional recommendations for purchases when appropriate.

Simplified Hierarchy of Processing Speed Factors

Assessing your systems

For business applications, the most important factors in determining your system’s operating speed are the CPU, memory, hard drive, and operating system (OS). Internet bandwidth and network speed also contribute to how fast your systems process data. In the remainder of this article, we will take a closer, slightly more technical look at these individual factors, offer some specific recommendations, and give you instructions on how to evaluate certain components. Software can also affect your perception of system performance, but we won’t be getting into that here.

In order to get a more comprehensive evaluation of your individual systems, you can download a trial of PassMark’s benchmarking software and see how your machines compare with other users’ benchmarked systems:
http://www.passmark.com/products/pt.htm

CPU
PassMark’s extensive database includes benchmarks for over 1,300 CPUs. Some are specifically designed for virtualized server environments, while others are designed to maximize the battery life of notebooks. Understanding where your current CPU fits within the benchmarks will help you glean what type of benefit you would see from a faster processor.

Assuming you are using a Windows operating system, you can identify the processor your PC uses by holding down the WINDOWS key and pressing the BREAK key, which is usually in the upper right corner of your keyboard. Once you do this, you will see the System Properties window, which includes a line identifying your processor.

Look for the line that identifies your processor, then click on the link below and see if you can find your processor on one of the lists.
http://www.cpubenchmark.net/

Using this resource, you should be able to compare the benchmark scores of your processor to those of prospective new PC replacements and approximate the relative processing speed gain.
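If you prefer a scripted check over the keyboard shortcut, a minimal sketch using only Python’s standard library prints the processor identification string on most systems (on Windows it resembles the System Properties entry; on other platforms the string may be less descriptive):

```python
import platform

# Print the CPU identification string the OS reports.
# Falls back to the machine architecture if the processor
# string is empty, which happens on some platforms.
print(platform.processor() or platform.machine())
```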

When purchasing new PCs, we prefer to buy the fastest processors we can without paying an unreasonable premium. We expect the cost to be relatively proportional to the processing speed of various CPU options; we might pay 15% more for a processor that is 20% faster, but we would not pay 66% more for a processor that’s only 10% faster.
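The premium rule above boils down to benchmark points per dollar. As a sketch, with entirely hypothetical scores and prices chosen only to illustrate the 15%/20% and 66%/10% comparisons:

```python
def perf_per_dollar(score, price):
    """Benchmark points per dollar: a rough value metric for comparing CPU options."""
    return score / price

# Hypothetical numbers for illustration only.
baseline = perf_per_dollar(2000, 200)  # current CPU option
upgrade = perf_per_dollar(2400, 230)   # 20% faster for 15% more: better value
premium = perf_per_dollar(2200, 332)   # 10% faster for 66% more: worse value

assert upgrade > baseline
assert premium < baseline
```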

Memory
Memory is relatively cheap. Accessing information from random access memory (RAM) rather than hard drive space or network storage is ideal, since reading from RAM is much quicker than pulling data from your hard drive or network. PCs running XP should have 3-4gb. XP cannot access all 4gb, but typically uses a little more than 3gb. Machines running Windows 7 should have at least 4gb, or better yet, 8gb. In some cases, you can add 8gb of memory to an older PC for as little as $100.

For optimal performance, memory speeds should match the maximum supported by your PC.

Hard Drive
Buy the fastest hard drives you can afford. You are unlikely to regret it. We have long enjoyed using Western Digital’s Raptor drives (10k RPM) on our workstations. More recently, we have selectively switched to OCZ’s Solid State Drive (SSD).

The link below will take you to PassMark’s list of benchmarked hard drives:
http://www.harddrivebenchmark.net/

Hopefully, you can find your workstation’s hard drive in the “High-End Drive Chart.” If you cannot, you should strongly consider upgrading to an SSD because:

1. SSDs use 80% less power.
2. SSDs are silent.
3. SSDs are much faster than traditional hard drives. (An OCZ Vertex 2 SSD drive is about twice as fast as a 10k Western Digital Raptor drive.)
4. SSDs are more durable and reliable.
5. SSDs are affordable. An 80gb drive, which should be enough for most workstations, costs $150.

If you want to compare your current hard drive’s benchmark to those of prospective replacement drives, open Windows Explorer by holding down the WINDOWS key and pressing the “E” key, then right-click on your C: drive and select Properties. The Hardware tab should contain the model number of your hard drive; using this information, you should be able to find its benchmark.
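If you would rather measure than look things up, a crude sequential-read test can be sketched with Python’s standard library. OS caching inflates the number, so treat it strictly as a relative figure for comparing one machine to another; the file size is an arbitrary choice:

```python
import os
import tempfile
import time

def disk_read_mb_per_s(size_mb=64, block=1024 * 1024):
    """Rough sequential-read throughput of the drive holding the temp directory.
    OS caching inflates the result; useful only for relative comparisons."""
    fd, path = tempfile.mkstemp()
    try:
        # Write size_mb megabytes of random data to a temp file.
        with os.fdopen(fd, "wb") as f:
            f.write(os.urandom(block) * size_mb)
        # Time a full sequential read of the file.
        start = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(block):
                pass
        elapsed = time.perf_counter() - start
        return size_mb / elapsed
    finally:
        os.remove(path)

print(f"~{disk_read_mb_per_s():.0f} MB/s sequential read")
```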


Operating System
In the investment business, the reliability of systems is paramount. Selecting the right operating system for your workstations may be one of the most important things you can do to improve systems infrastructure. The majority of RIAs have been stuck on Windows XP for quite some time. Torn between staying on what works with all their existing software and switching to the latest Microsoft OS, many have done nothing.

Vista was a nightmare for early adopters. We upgraded our best system when it came out, and that machine subsequently became dedicated to IE browsing and Office 2007 use. In all other respects, it was a pain.

In contrast to Windows XP and Vista, Windows 7 is a rock-solid product. We have been using Windows 7 Ultimate (64-bit) heavily for about a year. On systems configured with 4gb to 8gb of RAM and high-end hard drives (the SSDs and Raptors mentioned earlier), we have yet to see Windows 7 seize up the way Windows XP and Vista might. These systems consistently and fluidly respond to user requests.

Once Advent Software announces support for Windows 7 with Axys, we expect that many RIAs will finally upgrade to Windows 7 Professional. Before you decide to move to Windows 7, you should verify that all of your software is compatible with the specific version of Windows 7 you intend to implement.

Choosing the right network operating system (NOS) is also extremely important. A large number of firms are still using Windows Server 2003, but they should be planning to migrate to Windows Server 2008 R2 within the next year. The prevalence of DR sites makes switching an RIA’s NOS a more complicated and expensive venture, but newer systems offer valuable features, such as increased security and integration with Windows 7, that provide meaningful incentives to upgrade.

Upgrading the “brains” of your IT infrastructure needs to be carefully planned, scheduled and executed to ensure a successful outcome. In-place upgrades of mission-critical servers are an absolute “no-no” without redundant systems to fall back on.

The best practice for systems that aren’t virtualized is to buy new equipment with the new NOS for both your primary site and your DR site. Virtualized systems offer more flexibility: the ability to store server images allows you to easily back up virtual machines and revert to a previous image if necessary.

Internet Bandwidth
Sometimes users mistake slow Internet access for slow processing speed on their PCs. Identifying these problems correctly is an important part of assessing the speed of your systems.

You can use the link below to test your Internet speed, but in order to get a truly accurate reading you will need to be the only user connected to the Internet. In any event, this test should give you a general idea of your Internet connection’s upload and download speeds.

http://www.speakeasy.net/speedtest/

If you are experiencing a processing problem on your system, try running this test to see what your upload and download speeds are at the time.
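If you log raw transfer numbers yourself (say, the size of a file and how long it took to fetch), converting them into the megabits-per-second figures these tests report is simple arithmetic; the sample numbers below are invented for illustration:

```python
def mbps(num_bytes, seconds):
    """Convert a measured transfer (bytes over seconds) to megabits per second."""
    return num_bytes * 8 / seconds / 1_000_000

# e.g. a 25 MB file fetched in 10 seconds corresponds to a 20 Mbps connection
print(f"{mbps(25_000_000, 10):.1f} Mbps")
```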

Domain Name Server (DNS)
When you type a URL into a web browser, the domain name you type needs to be resolved to an IP address in order to download the information to your web browser. By default, a DNS provided by your Internet Service Provider (ISP) handles this. If you haven’t already done so, you should consider establishing a local DNS server to accelerate domain name resolution.
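To get a feel for how much resolution latency contributes to sluggish browsing, you can time a lookup with Python’s standard library. The sketch below times a single resolution; repeat it for hosts you actually use, bearing in mind that cached results return much faster than cold lookups:

```python
import socket
import time

def resolve_time_ms(host):
    """Time a single DNS lookup for the given hostname, in milliseconds."""
    start = time.perf_counter()
    socket.getaddrinfo(host, None)
    return (time.perf_counter() - start) * 1000

print(f"{resolve_time_ms('localhost'):.1f} ms")
```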

Network Speed
Network speed is critical for clients that do processing-intensive work on their PCs. Firms using flat-file programs like Axys can see a dramatic improvement in processing by upgrading their LAN technology, but firms that utilize client-server databases locally or cloud-based apps may not.

Gigabit Ethernet (1G) is the standard. Ten Gigabit Ethernet (10G) is available, but with an estimated entry-level hardware cost of $1,500 per user (based on 24 users), the technology is cost-prohibitive for small to medium-sized RIAs; it is typically found in enterprise server rooms, not small and medium-sized businesses. Implementing 10G in most office environments also requires special cabling (category 6a or category 7). With the future in mind, those moving into new office space should consider paying the premium to install category 6a or category 7 cabling instead of category 5e or category 6, but should do their own cost-benefit analysis.

There are situations where decentralized use of 10G Ethernet could make sense (e.g. an Axys user with more than 10,000 accounts), but most firms will wait for the cost to come down to a more reasonable level. Since faster localized data processing is in demand at the enterprise level, prices may remain where they are for some time.

Many notebooks still do not have gigabit ports. If you are shopping for a notebook, make sure it has a Gigabit Ethernet port. If you still haven’t standardized on Gigabit Ethernet at your office, you should be able to do so at a hardware cost of less than $75 per user.

New systems or new parts?
The best configuration for your new workstations and servers is an affordable one that you never have to upgrade during the useful life of the equipment. While some of the recommendations we have made in this article can be applied individually, it is usually more cost-efficient to buy new equipment that has the right configuration of OS, memory, CPU and hard drive.

Before you spend money upgrading older technology, find out how much your existing equipment is worth. If you aren’t certain, you can look it up on eBay and see what the approximate replacement cost is. This is usually a good indication of how desirable your equipment is as well as its relative processing power by today’s standards, and may validate further investment in the equipment or help solidify plans to upgrade to new equipment in the near future.

About the Author:
Kevin Shea is President of InfoSystems Integrated, Inc. (ISI); ISI provides a wide variety of outsourced IT solutions to investment advisors nationwide. For details, please visit isitc.com or contact Kevin Shea via phone at 617-720-3400 x202 or e-mail at kshea@isitc.com.

Unless your computer systems have been severely neglected, the time required to run Axys reports on any given day is unlikely to disappoint most users. Axys taps data and produces most de facto reports promptly. Axys users with fewer than a hundred portfolios are unlikely to care about processing speed issues, but for those of you with several hundred or thousands of portfolios, the following performance-enhancing tips for Axys can shave hours, or perhaps days, off processing time at quarter end:

1. Rein in Overzealous Virus Scanning
Norton Antivirus and other antivirus products are engineered to perform real-time virus checks on all files every time they are accessed. While this functionality makes a lot of sense for office documents that could harbor viruses and for the bulk of your company files, repetitively checking Axys file types is unlikely to uncover any viruses. To speed processing, disable real-time antivirus checks on selected folders (such as Axys3) or specific file types (such as CLI files). It is still prudent to scan all files for viruses in a regular evening process; it just doesn’t make sense to check them each and every time they are accessed.

2. Optimize Network Configuration
In this day and age, we still occasionally run into companies that have obviously spent plenty of money on technology but overlooked the network needs of their investment operations area. IT consultants and/or in-house staff need to understand that Axys is not a client-server application. As such, processing takes place on a user’s local machine, but it is also dependent on server speed, since files effectively need to be transferred to the local workstation so they can be processed. All workstations should be configured with gigabit network cards and connected to the server hosting the Axys application via gigabit switches. This sounds relatively straightforward, but “the devil is in the details.” If multiple switches exist between the server and the workstations, all of the switches involved need to be gigabit switches. The server should have a gigabit card. Most importantly, the network cards on the workstations and the server should be configured to connect at gigabit speeds.

3. Process Data Directly from the Server
If improving the network speed can enhance Axys performance, removing the network factor from the equation could further boost processing speed. In most circumstances, the operations team does not have the ability to connect to the server that houses Axys and run applications at their whim. Processing from the server can make sense in some smaller installations or in situations where the operations staff has been blessed with a dedicated server that they have authority to use as they wish. However, processing directly from the server does not guarantee improved performance over processing from a networked workstation. It is possible that a server with poor processing power would underperform when compared with a faster computer connected to the server via a gigabit network connection. As IT professionals ourselves, we have reservations regarding this approach to resolving speed issues and do not currently believe it is a best practice.

4. Distribute Processing Across Multiple Workstations
Establishing a framework for parallel report production empowers firms to scale their production as their business grows. If proper planning is done up front to organize the work required by various business units at a company, report production can be parceled off and done simultaneously or asynchronously as various business units demand. Proper planning entails creating the reports, scripts and macros to support parallel report production. Without the necessary infrastructure design and corresponding development, scripts and macros architected for a single-user environment can’t be run autonomously as needed.

5. Buy Faster Workstations and/or Servers
Buying a faster workstation should increase your processing speed, but it is conceivable that a bottleneck could exist at the server that limits the positive effect of purchasing a faster workstation. Likewise, buying a brand new server that is connected to older workstations will only do so much. When considering new hardware purchases, make sure you examine all possible bottlenecks. In our experience assisting clients with the development and integration necessary to automate their reporting systems, we found that individual users’ Axys processing performance varied greatly. Recognizing the need to benchmark Axys processing speed, we created a simple report in REPLANG that allows us to quickly evaluate Axys performance from any workstation. This utility and other Axys tools are available from our website.

Feel free to download our basic speed test report (speed.rep) at http://www.isitc.com/aug or recreate it from the code listed below:

format 0
.\n\n\n\n
.Axys Processing Speed Benchmark Test\n
.v1.00 Last Modified 01-12-2007\n
.Copyright 2007 InfoSystems Integrated Inc.\n
.All rights reserved.\n
.\n
.www.isitc.com\n\n\n
#limit 10000
#accounts 0
ask #limit Enter the number of records to read?
#limit #limit 1-
.Start time $:now\n
label cc
load cli
#accounts #accounts 1+
if #accounts #limit >
goto ee
next cli
goto cc
label ee
. End time $:now\n
.#accounts read.\n\n
end
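For a quick sanity check outside of Axys, a rough Python analog of the speed report above can time sequential reads of many small files, similar to how Axys walks client (.cli) files. The file count and size below are invented for illustration, and the script reads generated temp files rather than real Axys data, so absolute numbers are not comparable to the REPLANG report:

```python
import os
import tempfile
import time

def file_read_benchmark(num_files=1000, size=4096):
    """Time sequential reads of many small files, a rough proxy for the
    record-by-record processing the REPLANG speed report measures."""
    with tempfile.TemporaryDirectory() as d:
        # Generate num_files small files of random data.
        for i in range(num_files):
            with open(os.path.join(d, f"{i:05d}.cli"), "wb") as f:
                f.write(os.urandom(size))
        # Time reading them all back in order.
        start = time.perf_counter()
        for name in sorted(os.listdir(d)):
            with open(os.path.join(d, name), "rb") as f:
                f.read()
        return time.perf_counter() - start

print(f"{file_read_benchmark():.2f} s to read 1000 files")
```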

As you review and potentially implement the tips from this article, perform follow-up speed checks to measure the effectiveness of the methods detailed. Test processing with our “speed report” from each individual workstation and, if possible, your Axys server. If all of your equipment is standardized, you will likely see very little difference in the benchmarks of various workstations. In my own tests, the processing time of 10,000 records went from 46 seconds to 3 seconds, making me wonder why I didn’t optimize my environment sooner.

This article was originally published in the Advent User Group newsletter in 2007.

About the Author:
Kevin Shea is President of InfoSystems Integrated, Inc. (ISI); ISI provides a wide variety of outsourced IT solutions to investment advisors nationwide. For more information, please visit isitc.com, contact Kevin Shea via phone at 617-720-3400 x202, or e-mail at kshea@isitc.com.