As some of you who follow me on Facebook, Twitter or Google Plus know, I frequently share articles on cyber security and the potential threat of infection. Malware of all kinds is appearing and running rampant on the internet. Authors predict cyber attacks will result in everything from identity theft to the annihilation of mankind.
On any given day you can read about newly discovered threats, or a variant of an older piece of code modified to be even more clever and evade detection. These programs hide in the background of computers, poking and prodding, collecting information and even delivering the electronic goods to the bad guys.
Companies and governments are very concerned that cyberspace is the new battleground and increasingly sophisticated malware is the new weaponry. Corporations are afraid of losing trade secrets and governments are afraid of losing control.
Programs like the now-famous Stuxnet have been devised and successfully deployed against another country, where they caused weapons-related manufacturing equipment to destroy itself.
Companies spend millions of dollars to erect solid defenses, including firewalls and various intrusion detection systems. Every computer is outfitted with malware detection software, updated regularly to ward off the most recent threats. Government networks are even more secure, with no physical connections to the outside.
But the chain is only as strong as its weakest link, and the weak link will always be the people using the computers.
I am continually amazed at the level of investment in hardware, software and the cost of remediation, as compared to the paltry amount of education delivered to employees and the public. We're frequently warned about diet and exercise, smoking, drinking and drugs. Yet have you ever seen a public service announcement about the latest zero-day exploit? An outbreak of the flu or a bad batch of canned peaches will be plastered all over television news.
Perhaps we need an "Amber Alert" for computing systems. How about a couple of corny slogans, such as "if you see something behaving oddly on your computer, say something," or perhaps "always make sure your memory stick is virus free before inserting it in another person's computer."
If there are as many serious threats out there as one would be led to believe, it is going to be incumbent on each and every computer user to be fully versed in how to avoid threats, how to spot potential problems, and how to quickly alert others when they are discovered.
Computer security should be addressed in the same way as public health. Teach everyone how to engage in safe computing, how to obtain safe and effective remedies, and how to avoid spreading the disease once they have it.
Captain Joe
Follow me on Twitter @JPuglisiLLC
Sunday, January 6, 2013
An Ounce of Prevention
In my many years as a technology professional, one of the worst trends I have observed is the preference for the quick fix. This is the patch, the work-around, the extra step or two that compensates for an otherwise flawed process.
There is an old adage that suggests there is never enough time to do it right, but there is always enough time to do it over. This is clearly now the order of the day.
How many times does a road crew have to fill the same pothole before realizing the street has to be repaved? We have become so adept at creating ways of avoiding problems that we forget to go back and fix their root cause. One of my favorite road signs is the one that says BUMP ahead. If you know there is a bump, don't hang a sign, fix it!
This mindset carries over into technology. Apparently it is easier to dump raw data into a spreadsheet and massage it until all the missing or incorrect values have been resolved. But do we ever take the time to trace back to the source of the bad data and put new processes in place to avoid storing it in the first place? No, instead we dump the same flawed data month after month into a spreadsheet. In fact, we build macros to automate the correction process.
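To make the alternative concrete, here is a minimal sketch in C of what validating at the source might look like. The record structure and its fields are entirely my invention for illustration; the point is simply that a bad value gets rejected before it is ever stored, rather than corrected downstream.

    #include <stdio.h>

    /* A hypothetical monthly sales record -- the fields are
       invented purely for illustration. */
    struct record {
        char   customer[64];
        double amount;
    };

    /* Validate at the source: refuse to store a bad record instead
       of massaging it later in a spreadsheet. Returns 0 on success
       and a nonzero reason code on failure. */
    int store_record(const struct record *r)
    {
        if (r == NULL || r->customer[0] == '\0') {
            fprintf(stderr, "rejected: missing customer name\n");
            return 1;
        }
        if (r->amount < 0.0) {
            fprintf(stderr, "rejected: negative amount\n");
            return 2;
        }
        /* ...write the clean record to permanent storage here... */
        return 0;
    }

A few lines of checking at the point of entry, and the monthly spreadsheet cleanup disappears.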
I've observed that in some of the new programming tools we have lost the ability to check a return code. Those of you who have written programs will recall these special variables, set to specific values after an operation. A return code of zero (0) usually meant success, while other values would indicate the reason for the failure. Looking at these codes would enable you to take the appropriate action: recover from the failure, or warn of bad results.
Even when error codes are available, it often appears they are not being used. It makes me wonder if error checking has gone the way of memory optimization. Now it only matters that the process ends, whether it did what it was supposed to or not.
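For those who have not written code in a while, here is a minimal sketch in C of the discipline I mean (the file name is hypothetical, just for the example). The standard library still hands back these codes; the trick is to actually look at them:

    #include <errno.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        /* fopen returns NULL on failure and sets errno to say why. */
        FILE *fp = fopen("report.dat", "r");
        if (fp == NULL) {
            /* Check the code and take appropriate action instead of
               pressing on and producing bad results. */
            fprintf(stderr, "could not open report.dat: %s\n",
                    strerror(errno));
            return 1;  /* nonzero: tell our caller we failed, too */
        }

        /* ...process the file here... */

        fclose(fp);
        return 0;      /* zero: success, the time-honored convention */
    }

The convention is as old as the tools; only the habit of honoring it seems to have faded.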
An employee of mine many years ago reported that he had completed his assignment to write a piece of code. He had entered and compiled it successfully, loaded and executed it. It ran, he told me, so he was done. He then went on to mention it produced the wrong results, but he still considered his assignment complete. You can't make this stuff up.
We have to return to the discipline of getting it right the first time and addressing the source of problems. Let's not continue to focus on mitigating the symptoms; let's instead get in there and cure the underlying disease.
Captain Joe
Follow me on Twitter @JPuglisiLLC