Ramblings on Small Business Technology

End of Windows 7

As of January 2020, Windows 7 will no longer receive security updates.  If, like roughly 25% of business users, you’re still running a computer with Windows 7 (or Windows Server 2008), then the clock is ticking until someone discovers and exploits the next vulnerability in Windows 7 – and Microsoft will no longer fix it before it’s widely exploited.

Like Windows 95 before it, Windows 7 has served us well. But time marches on and the desktop Gods have determined that Windows 10 will be the new King of the Desktop. Some of it is branding, some of it is the desire to move everyone to a subscription model, and a small part of it, yes, is about a better product.

Windows 10 Is Out. Do You Need It?

Now that Microsoft has officially launched Windows 10, there will be a big push to get users of previous versions to upgrade to the newest Windows.  Microsoft is offering a “free” upgrade to most non-business users of previous Windows versions (Windows 7, 8, and 8.1, but not XP and earlier). In addition, Microsoft is (literally) pushing the upgrade online, through its Windows Update service.  If you’re eligible for a free upgrade, they have by now pushed an update to your computer that advertises the free upgrade (in the form of a small Windows logo in the system tray next to the clock) and entices you to register to receive it.  Should you?

If you’re on Windows 8.1 (or – shudder – still on Windows 8), you should absolutely take the leap to Windows 10. Windows 8 was such an unmitigated disaster (which the 8.1 update only marginally fixed) that the jump to 10 is a no-brainer.  Technically, the system requirements are the same (the RAM requirements may even be lower), so there should be no reason your Windows 8/8.1 system couldn’t handle the upgrade to 10.  Most of the tablet/touch functionality has been preserved or improved, and the desktop (keyboard & mouse) mode has been restored to something that’s usable (which 8/8.1 wasn’t).  There are other improvements too, such as the new Edge browser, which shows promise to finally kill the beast that IE has become. Pull the trigger on the upgrade now!

If you’re still on Windows 7, it’s not as clear a decision.  Sure, there’s the bright shininess of W10, but there’s also significant user retraining (arguably much less than the W8/8.1 transition required).  If you’re used to doing things a certain way (and that way works for you), then W10 might change that a little or a lot – and for what benefit?  Speed? Security? Additional functionality? W10 only marginally improves all of these areas.  For most W7 users, there is not a compelling reason to upgrade (just yet).  Of course, this may change after Microsoft starts adding functionality this Fall.  Oh, didn’t you hear? They promise much more frequent updates and new features (the equivalent of mini service packs), all released and forced down your throat – er, pushed – to your computer through the update service.

Of course, if your computer is part of a business domain, you won’t get a choice either way.  Your system administrator (people like me) and IT department will decide whether to upgrade your systems.  Oh, and those upgrades won’t be free.  Microsoft would like your licensing money, thank you very much.  Arguably, this initial version of 10 is primarily for home and student consumers (and the release date is timed for back-to-school purchase).  Many necessary business features aren’t included and will be coming later this year (or even next year), so this W10 really isn’t ready for business yet. I would expect businesses to take much longer to push users into W10, probably as part of an overall hardware replacement schedule.

So if you’re buying a new system, Windows 10 is just fine.  Have a current W8/8.1 system? By all means, get the upgrade.  Still on Windows 7?  I’d hang out awhile and watch what happens…

Backup and Recovery

One of the biggest questions I get from customers is about backups.  Systems are built on data, data lives on storage media (usually hard disk drives), and storage media has a known failure rate.  When it fails, bad things happen. It’s important to differentiate between system and media failures (components that just go bad) and Disasters (things that happen to your business, such as tornadoes or acts of God, that are independent of single component failures).  Disaster Recovery is an equally important but separate topic, which I’ll cover at a later date. But now back to system backups…

I learned the hard way, a long time ago, with an ancient Mac laptop and school work.  Disk failures happen, and when they do, you’re screwed unless you have backups.

This leads to the rule: always have 2 copies of your data – one of live data, and another of a recent backup.  How recent? That is the question.  How much work can you stand to redo by reentering transactions and/or manually recreating entries you’ve already made?  If it’s a few dozen, probably not a big deal.  A few thousand, and it’s not so easy.

This leads to the first term in backups: Recovery Point Objective (RPO).  RPO is how much data, measured in time, you can stand to lose and manually recreate after a restore.  If your RPO is 4 hours, then you expect to re-create at most 4 hours of transactions after restoring from a failure.

The other consideration for backups is Recovery Time Objective (RTO), or “how long does it take to restore the system to the RPO?”  After a failure, how long does it take to obtain the backups, obtain restore media, and do the restoration? Data restoration typically takes hours, and sometimes longer, so RTO is an important consideration.  If a short RTO is required, then a system that stores duplicate data is usually warranted: you replace failed storage A with backup storage B, and you’re done.  Quick, but expensive.  If you can stand a longer recovery time, then you live with a backup drive that takes minutes or hours to restore to other media. While the restoration is in progress, the systems are offline and nothing is happening.  After the restoration, the system still has to be brought current with the non-backed-up transactions.  Therefore, the true recovery time will be: time to detect the failure + time to restore the backup + time to reenter the transactions since the backup.  That could be hours or days. How long can you be in limbo?
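
Here’s that recovery-time arithmetic as a minimal sketch (the hours below are hypothetical examples, not measurements from any particular system):

    # Total downtime = detect the failure + restore the backup
    #                + re-enter the transactions made since the backup.
    def total_recovery_hours(detect: float, restore: float, reenter: float) -> float:
        return detect + restore + reenter

    # Example: failure noticed after 1 hour, a 3-hour restore, and a
    # 4-hour RPO's worth of transactions to re-key.
    print(total_recovery_hours(detect=1, restore=3, reenter=4))  # 8 hours of limbo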

There’s a serious cost-vs-time tradeoff for backups.  Rest assured, 10 out of 10 systems will fail.  The question is when.  Are you prepared?

Cloud Phone Systems

In the past, I’ve been an advocate for both traditional POTS on-premise phone systems (PBX) as well as internal/hybrid/external VoIP phone systems.  Each type has its own set of advantages and disadvantages, and my recommendations to customers really depend on their needs and level of sophistication.  Some can embrace VoIP while some need to stay with the tried-and-true POTS.

Recently, I had a customer move from one office to a temporary office.  Knowing that the arrangement was temporary until a new permanent office build-out could be completed, and having dealt with the existing provider (cBeyond and an on-premise Nortel CICS), I recommended that the customer move to a complete cloud-based VoIP service.  I made this recommendation based on the following:
– I knew the broadband at the temp office and the new permanent location was excellent.
– The people in the office needed advanced features and could embrace the change in phones.
– I had “dogfooded” the system for a few months and knew it would work.

After evaluating several cloud-based phone providers, I chose RingCentral for the customer based on my good experience with them and the feature set.  Deployment was straightforward, as was provisioning new Cisco VoIP phones (RC supports many Cisco and other models out of the box).  The biggest hassle was porting the existing numbers from cBeyond, because cBeyond requires that all numbers be dispensed with prior to account closure (you must kill any unwanted numbers before the port or it will be denied).  After that little hiccup, all the numbers were moved to RingCentral within a week!

The customer sure seems to love the new RingCentral solution.  The office people adapted to the new phones easily with simple training, and they really like the companion PC-based RC desktop app that shows the call log and allows easy blocking of telemarketing calls.  The bonus features of Fax-to-email and multiple outbound voicemail greetings have been a business benefit as well.

Did I mention that the business owner is happy because of the cost savings?

The Trend Towards Server Virtualization

In the last few years, Server Virtualization has taken off.  Many business people are confused by exactly what Server Virtualization means, but to IT people, it has been a huge benefit.  It’s also a huge benefit to business in general, though often underappreciated.

What is it and why is it such a big deal? Server Virtualization is basically a technology that allows traditional servers (hardware + operating system + applications) to be “encapsulated” and run in conjunction with other encapsulated servers on common hardware.

Why is this a big thing? Well, due to Moore’s Law, hardware (and specifically server hardware) has increased in capacity and performance far faster than the software requirements of most business applications.  The result is that most traditional hardware servers are under-utilized, often running at 10-20% utilization or below, while still consuming valuable power and cooling, taking up valuable rack space, and tying up capital in hardware.  Why can’t you just combine various applications on the same server? Because of conflicting software requirements, many vendors won’t support or certify their applications when co-mingled with other non-certified applications.  So, de facto, each application must be installed on its own server, alone, and many CPU cycles are wasted.

Server Virtualization technology allows several “virtual” servers to be executed on common hardware, using common hardware resources and “time-sliced” in such a way that each server thinks it’s on its own hardware, while in reality the CPU cycles are shared, resulting in much higher utilization of the physical hardware.  Using a single high-performance server, you can host several (sometimes up to 50-100) virtual servers. This results in far fewer physical servers, each running at much higher utilization and using the available resources much more efficiently.
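
To make the consolidation math concrete, here’s a back-of-the-envelope sketch (all figures are hypothetical examples, not benchmarks of any particular hardware):

    import math

    physical_servers = 10    # existing under-utilized boxes
    avg_utilization = 0.15   # each runs at ~15% CPU
    host_target = 0.75       # leave headroom on the shared host

    # Total real work, expressed in "fully busy server" units:
    work = physical_servers * avg_utilization   # 1.5 servers' worth

    # Virtualization hosts needed to carry that work with headroom:
    hosts_needed = math.ceil(work / host_target)
    print(f"{physical_servers} servers -> {hosts_needed} host(s)")   # 10 -> 2

Even with conservative headroom, ten lightly loaded boxes collapse onto a couple of hosts.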

I have found that, even in very small businesses, server virtualization has huge benefits, and many additional uses can be realized with no (or very limited) additional hardware purchases – something that’s high on the minds of SMB owners.  These include:
– Backup and Recovery is easier and more consistent
– It’s easy to try out new applications without affecting production systems
– Disaster Recovery is possible since everything is encapsulated and fairly easily transportable to different hardware
– Remote management is easier (note: I’ve even got a failed server up and running again with my iPad at 35,000 feet on an airplane)
– Hardware management is simplified and hardware reboots are infrequent

Since the existing virtualization technology “encapsulates” the virtual server into a small set of files, virtual servers are much more portable and robust.  Backing up a server means backing up as few as eight files (configuration, logs, memory, and virtual disks) rather than a whole machine of thousands of separate files.  These are self-contained and can be restored to similar (but not identical) hardware at a moment’s notice.
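
As an illustration, a file-level backup can be as simple as copying that handful of files.  This is a minimal sketch assuming the guest is powered off or snapshotted first so the files are consistent; the paths and VMware-style file extensions are hypothetical examples:

    import shutil
    from pathlib import Path

    VM_DIR = Path("/vmfs/volumes/datastore1/accounting-server")  # hypothetical
    BACKUP_DIR = Path("/mnt/backup/accounting-server")

    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    for f in VM_DIR.iterdir():
        # Copy the handful of files that make up the encapsulated server:
        # configuration (.vmx), NVRAM, logs, and virtual disks (.vmdk).
        if f.suffix in {".vmx", ".nvram", ".log", ".vmdk"}:
            shutil.copy2(f, BACKUP_DIR / f.name)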

I have “guinea-pigged” the technology on my own business systems for some time now, and I have subsequently converted most of my small business customers to Server Virtualization.  That has resulted in improved reliability and efficiency for the customers, and much less manageability headache for me.

The technology has matured over the past few years and become rock solid, with very little downtime and few outages.  If there is a problem with the underlying hardware, recovery to alternate hardware is much faster thanks to the abstraction away from the physical machine.

It’s a win-win for both the business and the IT contractor, and I encourage you to consider and embrace it!

Do You Need VoIP (Voice over IP)?

We’ve had the PSTN (Public Switched Telephone Network) – our existing wired telephone system – for a bit over 100 years now.  It’s reliable, simple to use (even old people have figured out touch tone and caller ID), and for the most part works great.  Why change?

For one thing, it’s getting too expensive.  With the advent of ubiquitous cell phones and free nationwide long distance, the traditional phone companies have to get very creative in finding ways to tack on fees to compensate for the revenue lost as people drop their phone lines and long-distance service.  With fewer people installing and maintaining wireline phones, the fixed costs must be spread over fewer subscribers, raising the cost per line. Most of my small business retail clients have 4-5 phone lines, which average over $300 per month.  That’s getting pretty expensive for a small business.
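
Here’s the comparison in rough numbers (a sketch with hypothetical prices – your quotes will vary):

    pots_lines, pots_per_line = 5, 62.00   # ~ $300/month across 5 lines
    voip_seats, voip_per_seat = 5, 30.00   # hypothetical cloud-VoIP seat price

    pots_monthly = pots_lines * pots_per_line
    voip_monthly = voip_seats * voip_per_seat
    print(f"POTS ${pots_monthly:.0f}/mo vs VoIP ${voip_monthly:.0f}/mo "
          f"= ${pots_monthly - voip_monthly:.0f}/mo saved")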

What is VoIP?  It’s a way of making a traditional phone call over a computer network (like the one in your office, or over the internet).  Why is it important?  Most offices already have internal computer networks and internet connections, so you don’t need a whole separate setup (phone lines and separate wired phones) to use VoIP.  That saves lots of money.  Also, most internet connections are not data-limited, so you’re not paying extra for the added traffic from your phone calls.
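
How much traffic are we talking about?  Here’s a rough estimate, assuming the common G.711 codec (about 64 kbps of audio plus IP/UDP/RTP overhead, roughly 87 kbps per call in each direction – typical values, not guarantees):

    KBPS_PER_CALL = 87   # G.711 audio payload plus packet overhead

    def max_concurrent_calls(uplink_kbps, reserve_fraction=0.25):
        # Leave a fraction of the uplink free for ordinary data traffic.
        usable = uplink_kbps * (1 - reserve_fraction)
        return int(usable // KBPS_PER_CALL)

    print(max_concurrent_calls(1500))   # a 1.5 Mbps uplink -> ~12 calls

A handful of simultaneous calls barely dents a typical business connection.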

VoIP also provides a lot more flexibility.  Want an office phone extension at your house?  No problem, as long as you have a decent internet connection.  Want to reach a coworker in a different city by dialing an internal extension (and not be charged long distance)?  Again, no problem.  Want a complex automated attendant script with IVR prompts?  Easy with the right VoIP setup.  All these features were available previously on the wireline phone network, but they were so expensive that only large companies had them.  Now, any SMB with an internet connection can have the same features as large enterprise companies.

What is the downside?  Well, there are a couple.  One, since the internet (and your internet connection) is not as reliable as the good old PSTN, your phone will only be as reliable as your data connection.  My home DSL has only gone down a couple of times this year that I can tell, and not for very long, so this problem is getting much better.  Two, there are a few legacy devices that don’t work as well over VoIP as they did over analog telephone lines – namely old fax machines and POS modems.  If you’re still using dial-up credit card validation, then you’re probably going to need at least one analog line (I really wish AmEx would move into the 20th century and go to IP validation!).  On the other side of that, I have one customer that relies entirely on IP validation, doesn’t even have a back-up analog modem, and has been operating without a problem for over a year.

The bottom line: if you have multiple analog phone lines, use lots of long distance, or need those big-office features (automated attendant, remote extension dialing), you can save a bunch of money by considering VoIP.

VoIP is coming on strong, and it is here to stay.  I predict that half of all analog phone lines will be gone in 10 years and most will be gone in 20. Are you ready for the new generation of telephone?

The Trouble with Vista

As you may have heard by now, Microsoft has terminated the sales of Windows XP (through most methods) and your only Windows desktop option is now Windows Vista.  Vista has been out for more than a year, but (other than for new home computers) hasn’t been widely accepted by the marketplace – especially by business customers.  Why? There are two primary reasons.

Reason One: Vista requires significantly more processor horsepower and memory to operate (in the “normal” mode with the jazzy graphics).  Many people have older computers with two- or three-year-old CPUs that just don’t have the horsepower Vista wants.  For those computers, it makes sense to keep XP until the hardware needs replacing – and in most cases the machine (with XP) is just fine for what people need, so no upgrades here.

Reason Two: Vista in some cases uses a completely different programming model than XP did, so a lot of software that ran fine on XP is broken on Vista.  Microsoft claims that the new model was implemented in the interest of security (and it probably was, since XP was VERY insecure to start with).  But with Service Pack 2 and the hotfix updates (and a decent IT staff), most XP desktops are reasonably secure for most business purposes.  Many businesses don’t want to risk having something critical break, just for the sake of the upgrade.

Many people in IT circles have hoped (and even petitioned) for Microsoft to keep selling XP, and they did extend sales for a few months (Microsoft typically sells the old OS for up to 24 months after the introduction of the new version – but we’re only at Vista + 18 months).  Another truism: only adopt the latest Microsoft product after the first “Service Pack” is issued (presumably fixing all the major bugs that weren’t caught prior to release).  So Microsoft rushed Vista Service Pack 1 to the masses in hopes of adoption after that milestone.  However, with the monthly “Patch Tuesday” updates, SP1 is really a moot exercise (most, if not all, of the updates in SP1 had already been pushed as part of the monthlies).

The real reason to stop selling XP: money (I know, you’re shocked!).  Most people are just fine with XP, and most PCs sold in the last 2-3 years (with XP) have far more capability than 99% of the working public needs.  XP works just fine, and it’s (reasonably) secure for most purposes.  Moreover, it’s like an old friend – familiar and convenient.  We know how to make it work and find things.  We’re productive on it.

Microsoft, in a fit of Apple-related paranoia, had to “improve” the user interface of Vista – make it pretty, shiny, and flashy – the “ohhh, ahhh” factor.  The problem is that it scares off lots of marginally computer-literate people.  If Microsoft had just re-engineered under the hood (for security) and left the user interface the same, it might have been a great product.  But then people wouldn’t have “seen” any changes to identify as “Vista”.  The best alternative (for the users) would have been an “XP Service Pack 4” that fixed several things under the hood but left the UI alone.  Unfortunately, Microsoft couldn’t charge for that, since they have set the precedent that Service Packs are free.  No more money for Microsoft.

So we’re collectively stuck with an upgrade that we didn’t really ask for, don’t really need, and don’t want – just for the sake of Microsoft product-cycle revenues.  If you need a new computer, Vista will work just fine, but you’ll be spending more than you really need on processor and memory (did I mention that the PC vendors like this too, for the same reason?).  You’ll have to learn the different parts of Vista (and the annoying “Are you sure you want to do that?” pop-up messages), but it will work.  But do you need it?  No, not really.

The real pain falls on the businesses that will be forced to operate in mixed mode for the next several years, since there is no business case to upgrade their current PCs, yet no way to keep everything on XP either.  It just means more time, cost, and headaches for your friendly IT staff, who wish we really had a choice in the matter.

FireFox 3 (or What’s The Best Browser, Part 3)

This week, the latest version of the FireFox web browser was released to much hype and fanfare.  Ignoring the hype, it’s still a major accomplishment.  I’ve been using it for almost a week now and I can tell you that I’m impressed by the increase in rendering speed and overall performance.

FireFox is another Open Source development effort (part of mozilla.org).  Interestingly, the first widely used web browser (Mosaic) was also freely available.  It eventually was reincarnated as Netscape Navigator, the first mass-market browser.  After Microsoft effectively killed off Netscape, the mozilla project was reborn from the proverbial ashes, and thankfully lives again.

I switched to FireFox when I got frustrated with Internet Explorer 6, arguably one of the worst browsers ever.  When IE7 came out (it’s amazing what a little competition will do to Microsoft), it was a significant improvement, but it still lacked the standards compliance and extensions model of FF2 – and there was a lot of standards non-compliance to get rid of.  Having been bitten by the DHTML/CSS web standards bug several years ago, most everything I develop is built to current web standards.  I am continually amazed at how many things render differently in IE7 versus more standards-accurate browsers (like FF), and consequently how much time I waste trying to make everything work correctly in both.  But FF2 had its problems too – it had a significant memory leak and would slow down if left open for multiple days.

After 2+ years, FireFox 3 is here.  It’s significantly faster than its predecessor (GMail is almost tolerable in FF3) and hasn’t lost any of the rendering compliance of FF2.  What’s more, I found myself checking the extensions catalog to see if my favorites had been upgraded to support FF3 before I pulled the trigger on the upgrade.  I wasn’t going to move to FF3 without AdBlock or the Web Development toolbar!  Extensions are a big part of the overall FireFox allure.

Like I said earlier, it’s amazing what a little competition can do.  Microsoft has announced they are working on Internet Explorer 8.  While I’m sure they will ram it down everyone’s throat (through Windows Update), they’d better get busy – there’s a lot of ground to make up to catch the current leader!

Open Source Software

I have recently become a big proponent of Open Source software.  What is Open Source? It’s software that is (usually) developed by a community of people, as opposed to a specific company.  The “open” comes from the publication and modification rights, which usually specify that the source code is freely available and can be used, modified, or repurposed – as long as the result is also “open” (most Open Source projects are released under the GNU General Public License, or GPL).

What does this mean for you? It usually means that the software is: 1) more robust (due to the larger number of community developers), 2) free (since no one company is profiting off the development), and 3) less platform-specific (since many developers mean many different platform preferences).  For most people, Linux (an open source variant of Unix) is synonymous with the open source “movement”, but there are open source packages for many different applications.  For example, this blog is powered by WordPress, an open source blogging package.  The “LAMP stack” (Linux, Apache, MySQL, PHP) is a popular web and application server platform that is all open source.  I also use OpenVPN as a way to connect back to my network (and pretty much do anything I need to do!) when I am out of my physical office.  There is also Nagios, an open source network and host monitoring system, and SugarCRM, an open source competitor to Microsoft Dynamics CRM and salesforce.com.  And these are just the ones that I have used!  There are hundreds more.

Coming from a predominantly Microsoft-based corporate environment, I embraced most Microsoft products as inevitabilities.  But lately, I’ve discovered that there ARE viable alternatives that are just as capable (and sometimes more so, since Microsoft tends to win market share by intimidation rather than lead with innovation).  There is even a very capable alternative to the 800-pound gorilla that is Microsoft Office (Open Office).

Open Source software is not always free to end users and customers, since most packages require some effort to install, implement, customize, and support.  One of the penalties (if you choose to see it that way) is that open source software often doesn’t come with a “one-click” installer like a lot of commercial software – usually because such installers are very limiting and open source packages are very flexible.  There are also business models where companies have grown around enhancing and selling services for open source software.  That’s not necessarily a bad thing, because many of these open source companies have products and services that are stellar in comparison to their commercial equivalents, for a fraction of the price.

I encourage you to embrace Open Source – I sure have!