Backup, HOW

This is the fourth & last post in my series about backup. If this is the first time you are visiting my blog, please read the prior backup posts first.

In the first installment I wrote about where to back up your data; after that first backup post there was a hiatus with a post about computers & dust. The next one was about what to back up, the different kinds of data files that need to be saved. The last one was a special mention of databases & mail. Also, the first post on this blog was about saving your family pictures from A to Z; go read it if you have not yet taken care of your digital pictures.

Right now, if you applied my suggestions about Where & What, your data should be arranged by type: you should know where your different files are located, & the files themselves should be renamed according to my protocol & stored under the related folders. If all this is in order you have accomplished 90% of the work! Yes, as in any technical work, if you are prepared & orderly, the process itself is easy.

So how do we back up all that important information?

First, I am going to differentiate between three general kinds of users. The distribution will not follow the usual Home / Business / Enterprise split, but will be segmented by the number of computers available to the user, be it a small business, home user or SOHO.

  • One computer only.
  • A workgroup. Several computers on a LAN (Local Area Network) without a server: any computers hooked up to a router that can access one another on the same network. This includes NAS (Network Attached Storage); in layman's terms, an external disk that can be plugged into the network and accessed by any computer on said LAN.
  • Several computers on a server based network.

Second, you have to choose the backup application, and this is more difficult. There are a lot of applications out there. Among the free ones, the best free local backup application in my opinion is SyncBack Free from 2BrightSparks. Just search for "free backup app" and you will have a cornucopia of results; go for the "Top 10" lists & choose the one you like best. If you are ready to spend some of your hard-earned shekels, the price bracket is between $20 and $70 for the normal types, & it goes up very fast to 3 or 4 figures for professional software like Symantec Backup Exec. If you go for broke & buy your software, choose one with versioning; this will let you address your backup much more professionally. I equate simple backup vs. versioning to taking a picture of your system once a day vs. having a nonstop video running, saving every change you make to your files. I use the paid version of SyncBack with versioning & I run it every 15 minutes! Yes, every 15 minutes: the more often you run your backup, the smaller the delta & the less material you have to move. You can even set your software to listen for any change to your files & copy it immediately; in my opinion that is too much of a bother & it taxes the CPU too much.
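To make the picture-vs-video analogy concrete, here is a minimal sketch of the versioning idea in Python. This is not how SyncBack works internally, just the concept: changed files are copied over, and the copy they replace is kept in a timestamped version folder. The paths in the commented example are placeholders.

```python
import shutil
from datetime import datetime
from pathlib import Path

def versioned_backup(source: Path, dest: Path, versions: Path) -> None:
    """Copy new/changed files from source to dest; keep replaced copies in versions."""
    for src in source.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(source)
        target = dest / rel
        # Skip files unchanged since the last run (copy2 preserves mtimes).
        if target.exists() and target.stat().st_mtime >= src.stat().st_mtime:
            continue
        if target.exists():
            # Move the about-to-be-overwritten copy into a timestamped folder.
            stamp = datetime.now().strftime("%Y-%m-%d_%H%M%S")
            keep = versions / stamp / rel
            keep.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(target), str(keep))
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, target)

# Placeholder paths; point these at your own data & backup disk:
# versioned_backup(Path("C:/Data"), Path("D:/Backup"), Path("D:/Backup_versions"))
```

Run from a scheduler every 15 minutes and only the delta moves, exactly as described above.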

Third, you simply differentiate your data by quantity. For example, I see three thresholds:

  1. Up to 7GB of overall data.
  2. Over 50GB of data. In this section we differentiate between different kinds of data:
  • Office or administrative information, which is low in media and high in text content & should not exceed the 7GB threshold.
  • Small media files, like pictures, that should amount on average to up to 50GB.
  • Heavy media files, like videos, that can reach terabytes of data.
  3. Up to 50GB of overall data. As in section 2, without the heavy media files.

On to the backup schemes;

Simple one-computer scheme;

One computer backs up locally first; then the data is uploaded to the cloud. This gives us a two-tiered approach. The first backup takes care of all the data with fast access for backup & retrieval. The second backup goes directly to the cloud; the easiest way is to open an account with one of the free cloud services and set the first backup location to be uploaded/copied/synced to the chosen service's servers.

Depending on the amount of data, we will copy all the targeted data to our chosen media;

  • An external detachable USB disk
  • An internal local disk
  • A disk on key

Using an extra internal disk is always the simplest way to ensure quick access & a timely procedure. If the disk is already in the computer you do not have to remember to hook it up to perform the backup! In this case you must also back up at least the administrative data to a cloud service, to be sure that in case of fire or theft you will still have your most important data. If you have a lot of important data you should implement a third tier: copy to an external detachable USB disk once a month & store it at another location!

You can either go with the free services for up to 7GB, or affordable paid storage for up to 50GB; that was the reason for the data-amount segmentation.

The Workgroup;

You choose one of the computers on your network and add a big disk to it. Install your backup software. Make sure that all other computers & NAS devices on the network are accessible. The data on those computers should be arranged according to the rules mentioned in the prior posts. Set up the software to copy all the relevant data to the backup disk. Each computer should have its own folder. The backup process itself should not take too much CPU, and remember: the higher the frequency of the backup (up to a point), the less data there is to copy. That's it, you have your main backup.
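The one-folder-per-computer layout above can be sketched in a few lines of Python. The share names & paths below are hypothetical placeholders for your own UNC paths or mount points; a real backup application adds scheduling, locked-file handling & logging on top of this.

```python
import shutil
from pathlib import Path

# Hypothetical network shares; replace with your own UNC paths or mounts.
SOURCES = {
    "reception-pc": Path(r"\\RECEPTION\Documents"),
    "accounting-pc": Path(r"\\ACCOUNTING\Documents"),
    "office-nas": Path(r"\\NAS\Shared"),
}
BACKUP_ROOT = Path(r"D:\WorkgroupBackup")

def mirror_workgroup(sources: dict, backup_root: Path) -> None:
    """Give each machine its own folder under the backup disk, then mirror into it."""
    for name, share in sources.items():
        dest = backup_root / name          # one folder per computer
        for src in share.rglob("*"):
            if not src.is_file():
                continue
            target = dest / src.relative_to(share)
            # Only copy files that changed since the last run.
            if target.exists() and target.stat().st_mtime >= src.stat().st_mtime:
                continue
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, target)

# mirror_workgroup(SOURCES, BACKUP_ROOT)
```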

As with the one-computer scheme, if you want to be protected from fire or theft you have to set the backed-up data to be synced to the cloud and/or copy it to one or more external disks. If you choose to have more than one disk, you will have to rotate them once a month so you always have a copy in another location. If you have a lot of data & cloud storage is too expensive, one possibility is to access your network from another computer over the internet & copy your backed-up data to this external computer overnight, every day. In this case the first backup should be done locally, then physically moved to the remote computer; if you have a lot of data, that first backup over the internet could take more than a few days. It's not practical!

Server based network;

The same as for the workgroup, but since all the data should already be on the server, it's more centralized (including server-based mail) & easier to find. In the case of POP3-based mail, the data will be located on the local computers, so you still have to access those computers to back up your data.


Backup, Special Database & Mail

The last post was about what to back up & how to prepare your data to be accessible.

This week I am going to focus on database backup & consequently on email! Yes, your everyday mailbox is more akin to a database backup than to any other kind of backup.

Database backup is different. First you have to understand the reality behind any file-access system. Any time a file is accessed, either by the computer system or another user, be it on the same computer or over the network, the file itself will be locked. That means that another user or service cannot delete or copy the open file; in some instances, as in Office, the file can be opened in read-only mode. With databases you access the "file", in fact the data, through the database application. To put it simply, the application is the only user/service accessing the data inside the database file, & multiple users can access the data itself only through the application. Another problem is avoided by this behavior: suppose you want to copy a large database file while it is accessed by numerous users. Since the copy process is not instantaneous, there will be changes in the file between the time you begin the copy & the time the copy process finishes. The result will be a corrupted file that you can't use! So what can we do?

There are several ways to copy (back up) a database file:

  • Copy the data out of the database file with the application itself. The problem is that most applications do not have this function embedded, & when they do, it's not automatic. You can use it manually for special circumstances.
  • Copy the data from the database file with the SQL engine. A lot better, but the problem with this solution is that the database engine does not always know how the different data tables are arranged, so the backup will have all the data but may not be easily accessed or restored in case of a disaster.
  • Close the application, so users will not have access to the database during the copy process. A lot better, but no cookies for you! This is called a "cold backup"; it's good for small businesses with small database files that do not work 24/7, but no major company will want such downtime.
  • Shadow Copy is the answer. Shadow Copy is an external service that can copy the database file while it is open: it accesses the file through the SQL engine, logs any changes made during the backup, then applies those changes at the end of the copy process. The result is an exact working copy of the database file. Bingo!
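Windows Shadow Copy itself is a system service, but the idea of taking a consistent copy of an open database is easy to demonstrate with SQLite's online backup API, which ships with Python. This is an analogy for the concept, not the Shadow Copy implementation:

```python
import sqlite3

# A "live" database that stays open while we back it up.
live = sqlite3.connect("crm.db")
live.execute("CREATE TABLE IF NOT EXISTS clients (id INTEGER PRIMARY KEY, name TEXT)")
live.execute("INSERT INTO clients (name) VALUES ('Jaffa Street Architects')")
live.commit()

# The online backup API copies the database page by page while it is open,
# producing a consistent snapshot without kicking users out (no cold backup).
snapshot = sqlite3.connect("crm-backup.db")
with snapshot:
    live.backup(snapshot)

count = snapshot.execute("SELECT COUNT(*) FROM clients").fetchone()[0]
snapshot.close()
live.close()
```

The snapshot is a fully working database file that can be copied to any backup medium, which is exactly what shadow-copy-aware backup software does for bigger engines.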

There are a lot of backup applications with the Shadow Copy process built in; for example, Symantec Backup Exec is simple but costly. For those who want to use simple file copy, there are Shadow Copy applications that will let you access the database files with any copy process. So that is the right way to do it: just choose the backup application you prefer, check that it can back up open files & databases, and you are fine. Well, that is if you are willing to pay $300 or more for the ease of use; small backup solutions up to $100 may not have shadow copy. Windows itself has a very limited Shadow Copy ability, but the implementation is arduous.

Now you are asking yourself: what does all that have to do with backing up my mail?

Wait, don’t go yet, I will explain. All local mailbox applications, like Microsoft Outlook, Mozilla Thunderbird, Pegasus Mail & others, store your information in some kind of database. Even if you are the only one accessing this information, you will have to apply one of the solutions mentioned above. But first you have to copy your data to your local hard disk:

  • Your data is in a free internet mailbox. Gmail, Yahoo, Walla & others like them offer free mailbox services. Google professes 99.9% uptime & generally they are right. The problem is: what if you are in the point-one percent? Since Google has millions of users, even this small percentage is an impressive number. Do you want to count on the law of averages, or do you want to be sure that your data is secure? What goes for Google goes exponentially for the other free services. Another thing is that Google & other free services change their terms all the time, & the bottom line is that you are not the owner of your own data!
  • Your data is on your internet service provider's server, or any other paid mailbox service. You stop paying, no more data. Again, until you download your data locally, you do not own your own data!

Only the Gmail service will never delete your data willingly, so having a local copy of your Gmail box should be enough insurance. For other mail providers, storing your data locally is only the first step. Let's go over the different ways to retrieve your data:

  • Most of the free mailbox providers on the internet let you retrieve your data through the POP3 protocol. This protocol permits most of the mainstream mail applications to store your data locally.
  • Yahoo Mail does not let you retrieve your data with the POP3 protocol. You will have to install a third-party application to copy your data locally, and this app will only work with Microsoft Outlook.
  • All paying service providers will help you access your data in many ways: POP3, Exchange server or other server/client flavors.

So just hook up your mail application to one of these services & it will download your data automatically. Which application?

  • Outlook Express. It's free with Windows XP. It's old & problematic; it also has a 2GB upper limit on each folder. Furthermore, Microsoft does not support it anymore! Local files default path:

    C:\Documents and Settings\"Your login name"\Local Settings\Application Data\Identities

  • Office 2007 or 2010 Outlook is one of the better solutions, but it will cost you! Outlook stores your data in a PST file; you have to check it from time to time to be sure it does not bloat. Too big a file is never a good idea: if your PST file is too big, just open a new one & archive your older data in it.
  • Office 2003 or XP Outlook are old & have severe size restrictions; they store your data in a PST file that has a 2GB upper limit. It can grow bigger than that, but it will be unstable.
  • All Outlook versions, Windows XP default data path: C:\Documents and Settings\"user_id"\Local Settings\Application Data\Microsoft\Outlook
  • Other mail solutions, like the open-source Thunderbird, can be used. A simple Google query about the default storage location for those applications will give you the exact path.

If you have an Exchange account with your mail service provider, the data is stored in an .OST file that lets you access your data when offline. Either way, you will not be able to copy the data file while Outlook is in use. So either execute a cold backup or use a third-party application like Microsoft's PST Backup tool. Whichever solution you use, it will have to run often so the data can be retrieved & used in case of disaster.

Once you have your data copied & stored locally, you can copy your file or files to your chosen backup media (see my previous posts).

Next week: HOW to back up.


Backup, What to backup.

The last backup post was about where to store the precious copies of your information; this one will be about what! How to address the different kinds of information.

Every system has its own backup needs, & the solution for a small business might not fit a medium company with multiple servers & Active Directory. A business-based scheme will not fit a private person or a one-computer professional. In this post I will begin by targeting the kinds of data we want to back up, then the different needs.

Any user, be it one computer or multiple servers, needs to assess the amount & location of the information to be copied, and then separate the different kinds of data in use on the computer system. The information will have to be very specifically arranged before you can back it up. This is a non-negotiable requirement: you cannot back up your data if you don't know where it is!


  • Family pictures can be found on personal computers in private hands & in small businesses; they should not be clogging business servers. Personal media, i.e. pictures & films stored on local computers, should be dealt with separately from business-related data. In my blog post about pictures I explain exactly what to do to save your family memories.
  • Business-related pictures should be dealt with exactly like the family pictures, so you can access them easily & do a thorough backup, but they will be stored in the same containers as your business information. You can either use the "My Pictures" subdirectory, open a dedicated "Pictures" directory under your main document folder or, for better retrieval, just store the pictures directly under the same client/project folders as the other related data. If a picture is date related it should be named accordingly, to index the pictures for better retrieval. For better chronological order the naming format will be:
  • "year-month-day", space, then the event description, space, then a running counter with two or three digits depending on the number of pictures.
  • Architect example: 2011-11-04 Jaffa street 19 front building 01
  • If you do not know how to batch-rename your files or are too busy, you can store them in a folder named under the same scheme without renaming the files themselves.
  • If you are a professional photographer you should treat your files the same way as videos. See the explanations in the video section.
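The naming protocol above can be automated. Here is a minimal batch-rename sketch in Python; the folder, date & event description are whatever you pass in (the example path is a placeholder), and files are numbered in sorted order:

```python
from pathlib import Path

def index_pictures(folder: str, date: str, event: str, digits: int = 2) -> list:
    """Rename every file in a folder to 'YYYY-MM-DD event NN.ext', in sorted order."""
    renamed = []
    files = sorted(p for p in Path(folder).iterdir() if p.is_file())
    for counter, src in enumerate(files, start=1):
        new_name = f"{date} {event} {counter:0{digits}d}{src.suffix.lower()}"
        dst = src.with_name(new_name)
        src.rename(dst)
        renamed.append(dst.name)
    return renamed

# Example (placeholder path):
# index_pictures("C:/Pictures/site-visit", "2011-11-04", "Jaffa street 19 front building")
```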

Audio

  • Audio recordings, & not songs! Any relevant recording should already be in the MP3 format or should be encoded right away. The need for quality is minimal & we should get very small files. The same rules as for pictures should apply.

Office data

  • Office data is normally stored & addressed as documents. Since I am blogging about backing up your data & not accessing it, I will not address the kind of application used to generate said documents, but I will refer to Microsoft Office since it is the most widely used office suite in Israel.
  • In Microsoft Office, letters are documents, shortened to "docs". The file suffix is either .doc, or .docx for later Office versions.
  • Spreadsheets are "Excel" files & the suffix is either .xls, or .xlsx for later versions.
  • Presentations are PowerPoints; the file suffix is .ppt for editable presentations & .pps for standalone slideshows.
  • Adobe has the well-known Reader format, also called PDF files.
  • Scanned documents can be pictures, PDFs, or OCR-generated documents (OCR: Optical Character Recognition). They are normally quite small, and while you need different kinds of software to access them, backup-wise they should be addressed as documents.
  • There are a lot of other file types out there for all sorts of office-related work; all of them are relatively small and consequently easy to back up. They should just be stored in the same containers as the related data.


  • Medium graphics applications, like AutoCAD, are used by architects, engineers & other technical businesses. The files generated by those technical applications are bigger than office files but still small enough to be on the same order of magnitude. For example, a CAD file is normally smaller than 100 MB & usually on the order of a few tens of megabytes. Those files are in between & can either be treated like office files, stored under the same folders & eventually backed up to the cloud, or addressed as bigger graphic files.
  • Photoshop & similar files are quite a bit bigger; they need to be addressed as a completely different matter. To back up this kind of file you need a lot of storage, & this storage always costs more. If it's local, you have the cost of the storage hardware & maintenance. If it's offsite, you need to add the networking cost, & if the remote site is not yours, you need to add the monthly fees, which can be very expensive. Offsite backup can be your home, another business location or a cloud storage service.


  • Clips taken from smartphones are the smallest. They equate to the big graphic files and should be dealt with accordingly.
  • Videos from non-professional & semi-professional cameras are the ones that best fit my market brackets. Their file sizes should vary between hundreds of megabytes and several gigabytes at the most. First they should be indexed exactly like the pictures, i.e. the format will be: "year-month-day", space, then the event description, space, then a running counter with two or three digits depending on the number of videos. Then the videos should be stored in a dedicated folder, apart from your other administrative data. Once the data is separated from the lighter information it will be easier to build a different scheme to back it up to a special location, be it remote or local. To be able to view the videos easily, I strongly recommend encoding them with an XviD-like format. This can shrink the video to up to 10 times smaller than the original, permitting normal storage with the related data & easy viewing with very little quality loss. I myself use the VirtualDub application; it is very light & permits batch encoding.
  • Computer-generated videos are videos, and fall under the same rules.
  • Video rushes (i.e. raw video material before editing). In professional post-production, the ratio between the rushes and the edited final product is about 10 to 1; in semi-professional editing like weddings & other such events, it can be as low as 3 to 1. For example, for three hours of shooting on location you will have about an hour of edited product. Video rushes are very heavy, we are talking about gigabytes of material, and they are nearly impossible to back up. Depending on the production's wishes they are either stored on non-volatile media like tape or DVD, or, in the case of news and other short-shelf-life material, just discarded. Only the finished product is saved & backed up.
  • Edited videos fall under the same rules as the second point in this section.
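For those who prefer scripting the batch encode instead of VirtualDub, here is a hedged sketch that drives ffmpeg from Python. It assumes ffmpeg is installed and built with the libxvid encoder; the quality values are starting points, not gospel:

```python
import subprocess
from pathlib import Path

def xvid_command(src: Path, dst: Path) -> list:
    """Build an ffmpeg command line that re-encodes a video with the XviD codec.

    Assumes an ffmpeg build with libxvid; tune -q:v (2 = best, 31 = worst).
    """
    return [
        "ffmpeg", "-i", str(src),
        "-c:v", "libxvid", "-q:v", "5",     # big size reduction, little visible loss
        "-c:a", "libmp3lame", "-q:a", "4",  # MP3 audio track
        str(dst),
    ]

def batch_encode(folder: str) -> None:
    """Encode every camera file in a folder, VirtualDub-batch style."""
    for src in Path(folder).iterdir():
        if src.suffix.lower() in {".avi", ".mov", ".mts"}:
            dst = src.with_suffix(".xvid.avi")
            subprocess.run(xvid_command(src, dst), check=True)

# batch_encode("D:/Videos/2011-11-04 Jaffa street 19")  # placeholder path
```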

Emails & databases are almost system data & I will address them in my next post.

System backup schemes are even more complicated than data backup; I will address the needs & solutions in other dedicated posts about system redundancy, load balancing & the like. Code backup schemes are another ballgame altogether & don't necessarily interest my core readers; I might address this in my DIY SysAdmin blog in the future.


Be Cool, Dust Free Computer Cooling.

In one of my last posts I wrote about computers & dust. Meanwhile I have heard a story about a fire that occurred because of dust! The computer was completely obstructed by dust, & the combination of dust, high voltage & heat did the rest: it seems that the PSU (Power Supply Unit) erupted in flames & half the house was destroyed before the firefighters put the fire out.

Today I will address desktop dust preventive maintenance and planning.

First & foremost, always buy a computer case with dust filters, so the only thing to clean is the filters! In this case you will not have to access the computer's inner workings often.

Removable Filter

With or without filters, you have to clean off the dust on a regular basis. The frequency of this intervention changes from user to user and is determined by a handful of parameters;

  • How much time the computer is on. Whether it works 24/7 or you use it only a few hours a day can make a tremendous difference.
  • How many rugs & carpets you have in the house. Wall-to-wall and/or thick carpets collect a lot of dust; no amount of vacuuming will get rid of it all. When you walk on them, the dust rises from the carpet and is drawn in by the computer's intake fan.
  • The computer's distance from the floor. Whether the computer is placed on the floor or on a table makes a big difference in the amount of dust drawn through the computer.
  • House pets. Yes! Cats & dogs shed their fur and the hairs get into every nook & cranny. Cats have finer hairs than dogs, so they are a bigger hazard to cooled household appliances, and the number of pets also matters. Pet hair is the worst parameter, since it clogs the filters or the CPU cooler fins and collects normal dust particles even quicker.

The solution to dust is simple & effective: clean it often! The tools required are;


  • A simple house vacuum.
  • A 1″ fine painting brush.
  • A standard Phillips screwdriver, which will open most cases.

Disclaimer; The author of this blog will not accept any liability for any system downtime, breakage or bodily damage caused by unprofessional handling when implementing this blog's instructions. I strongly advise using the services of a certified professional to apply these instructions. Always unplug your computer when servicing it.

If you had a good computer specialist advising you when purchasing your computer, he made sure that you bought a computer with a filtered case. If you have a filtered case, the only thing to do is clean all the case filters. If the filters are removable, just gently detach them from the case & clean them with water; be sure that the filters are completely dry before returning them to their place. If the filters are not removable, just vacuum them; it's even better if you use a fine brush to help detach dust particles while vacuuming. Even with filters the case is not completely sealed, and a certain amount of dust will always collect inside the case, on the fans, and especially on the CPU cooler block fins. You will have to clean that once or twice a year.

If your case does not have filters, the fans will have to be vacuumed; it is always easier to use a fine brush to help detach dust particles from the fan blades. Cleaning the CPU cooler is much harder: the CPU fan will have to be detached from the CPU cooler block for direct access, so the dust can be thoroughly cleaned. Not detaching the CPU fan can cause wear on the fan motor when using the vacuum cleaner, & in some instances the fan blades can even break. It is also recommended to clean any excess dust everywhere inside the case.

A computer should be cleaned in accordance with the different parameters mentioned above. For setups that fit all of the parameters, i.e. if the computer works all day long, the environment contains carpets & cats, and the case is on the floor, the dust should be cleaned monthly or at least every two months. In other cases, & according to the parameters mentioned above, the cleaning should take place every three months or every six months, but not less than once a year.

That’s all. Clean your computer to avoid any undue wear & tear, as explained in my post about dust. Better yet, change your case to a professional filtered case; you can find them at very affordable prices, like the one described in my review post. Always check for that when acquiring a new desktop.


Backup, Backup, Backup!

BACKUP is not a dirty word. Building a decent backup scheme is secondary only to setting up your system.

Today I want to spread before you all, my increasing number of readers, the different ways to save & protect your data and the importance of having one or more decent copies of your important files. There are different backup schemes for different systems, from the lone computer to small & medium businesses with 2 to 20 computers, with or without servers. I can never repeat it enough: all systems need a decent backup scheme! I will only touch on data backup & not system redundancy.

In one of my first posts I spoke about "Saving the Memories", aka backing up your family pictures. You can find there a lot of the same problems & some of the same solutions for single-computer users. So please visit that post to partake of unique solutions for protecting your photographic memories.

Any data that is stored on only one medium is susceptible to evaporating, never to be seen again! Electronic storage is intrinsically volatile and, as opposed to paper, can disappear at a finger click. In the daily cases I see in my line of work, there is a lot of lost data. Specifically, too many people come to me with their dead disk in hand, mostly the result of a computer heat stroke caused by a bad fan or environmental stress (see my prior post about computers & dust, or my post about the right computer case). They need their data retrieved! In the best-case scenarios the disk has only bad sectors and the data can be easily copied to new media, but in most cases the disk has suffered a malfunction, needing special recovery techniques. In some cases I even need to send the disk to a data-recovery specialist, who may or may not, for a substantial fee, recover the very important needed files.

So, what is a minimal backup? A minimal backup scheme is storing your data on two or more electronic media; the copy or copies should not be used for any purpose other than accessing them in case of need. That is all. That's Ilan's first & only important rule, that's the secret: always have one or more up-to-date copies of your data, anywhere possible.
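A copy you never verify is a copy you only hope you have. Here is a small sketch, in Python, that checks a backup folder against the original by hashing both sides; the folder names you pass in are your own:

```python
import hashlib
from pathlib import Path

def file_hash(path: Path) -> str:
    """MD5 of a file, read in 1MB chunks so big media files don't fill memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(original: str, backup: str) -> list:
    """Return the relative paths that are missing or differ in the backup copy."""
    orig, back = Path(original), Path(backup)
    problems = []
    for src in orig.rglob("*"):
        if not src.is_file():
            continue
        copy = back / src.relative_to(orig)
        if not copy.exists() or file_hash(copy) != file_hash(src):
            problems.append(str(src.relative_to(orig)))
    return problems

# verify_backup("C:/Data", "D:/Backup")  # placeholder paths; [] means all copies match
```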

Where can we store those important copies?

  • Backup cassette tape. I never believed in this kind of backup media: the tape reader is very costly, and you are never completely sure that the tape is in good condition. Tape storage conditions are very important; you can't just put it in your attaché case, since any working mobile phone close enough can erase it. Did you know that any backup tape has a maximum rewrite cycle? The manufacturer prints the number of times you can write data on the tape before it is no longer confident that the data will be safe! For your information, in most brands the max read/write cycle is between 20 and 50. As I said in the first sentence, I do not like it; having used tapes in the past & having lost data because of them, I do not advise using tapes.
  • Copy the data to DVD. It is a slow process, and DVDs deteriorate very easily: leave one for a week on your car's dashboard and try to access it! Its life expectancy, in the best of conditions, is about 5 years, give or take. If you have more than 4.5 GB of data it becomes problematic to know on which disc the data you want to retrieve is located.
  • Copy the data to a disk on key. Better: it does not deteriorate easily, but it can very easily be lost or erased by mistake, and it might be too small to contain all your data, especially if you deal with graphical media like pictures & movies. Having many of them becomes bothersome; as with DVDs, it is hard to find the requisite data, and it is not a good price/performance solution.
  • External USB disks. They do have better storage capacity, but they are the absolute least reliable: just drop one from your table and see for yourself. Remember that for the data to be secure, it is better to disconnect the disk between backup cycles.
  • Store it in the computer. Well, don't count on laptops; they are too easily dropped or lost, unless they just hold a copy to show around. The desktop computer is a good choice, but the hard disk is quite delicate: it has a life expectancy of 3 to 5 years, and too much heat can lower that to mere months, so you have to check it often & replace the backup disk every few years. Other than that, if done well, it is one of the best cost/performance solutions.
  • Network attached disks & dedicated backup servers. They have great storage capacity, and they are always on, so we can version the files & keep deleted files for a long time. They are independent from your computer, so viruses, faults or defects will not impede them. The two main disadvantages of a NAS are that it does not protect you from a local geographic catastrophe like fire, flood or wind, and that the price is too high for private users or small businesses with up to 4 computers.
  • Cloud storage. Cloud storage is the new buzzword. Lately all the major players are trying to lead in this domain: Ashay, Dropbox, Microsoft & Google to name the main ones, plus a myriad of less-known companies like Syncplicity, SugarSync, TeamDrive & others trying to get in on it as well. Beware of P2P cloud storage services like Symform that use your own computer to store other people's data, letting you save bigger amounts of data but using your internet bandwidth as well as your local disk. As soon as you exceed the basic free storage amount (between 2 & 7GB), the price/performance ratio becomes lousy; however, there are some operators out there, like JustCloud, that offer unlimited storage for a very basic fee. Time will tell if this business model holds any water. There are more caveats: don't forget that you are limited by your upload speed for the daily backup & even more so for the first data transfer. When I uploaded some of my data to SkyDrive, it took me 2 days to copy 17GB. Another problem is that most free cloud storage does not give you a private encryption solution, meaning that your data can be accessed & read by anyone at the storage company, as opposed to encrypted data that can be read only with the decryption key. In that case the provider points out to you quite firmly that if you lose your password, the data will be lost!

In this last section I recounted all the different media that can be used to store & protect your data. As you can see, no single solution gives us a full guarantee of protecting our pictures, all our business office files & databases, or the original very big & heavy Photoshop graphics files or video-editing rushes and final takes.

The answer to that is a Layered Backup Scheme that uses all or some of the aforementioned platforms. This, however, I will address in next week’s blog.

Posted in Uncategorized | 1 Comment

Computers & Dust

Physically cleaning the computer is no less important than scheduled defrag or registry cleaning & maintenance. The computer pumps a lot of air through its innards. The more fans, the more air passes from the front of the computer


Figure 1 – CPU Cooler Fins Dust Clogged

through the GPU & CPU fans and is channeled out by the exhaust fans. All this air is full of dust particles that cling to the fan blades, clog the air filters (if you have them) & finally plug up the CPU & GPU cooler fins. The other overheating culprit is a bad fan: if your fan makes noises like a tractor, you have to change it ASAP. When it stops making noise it is not because it is OK; it’s because it has stopped functioning altogether and is no longer pumping air in or out of your computer case.

In Figure 1 you can clearly see that the CPU cooler fins under the fan are completely blocked.

The design of the airflow in a computer case is very important; overheating, even with today’s motherboard-embedded heat-trip sensors, can fatally disable the motherboard components. The thing is that the heat around the CPU rises slowly as the cooling fans become blocked by the accumulating dust. The heat rises by small increments & it can take months or even years for the sensors to be tripped and finally trigger an alarm or an automatic shutdown. When the alarm finally fires, it’s too late! The damage has already been done. All the components around the CPU are already cooked and we begin to feel the different effects, like inexplicable freezes, a slowing of the computer that no amount of tweaking can repair, strange software behavior and so on. One of the only physical traces of this overheating is on the capacitors. The capacitor is


Figure 2 – Dry & Burst CPU Capacitors

a round cylinder placed at various spots on the motherboard. This cylinder should be hermetically sealed, & it encloses an electrolyte liquid that is essential to the good operation of the capacitor. Over the course of months of overheating this liquid expands & shrinks daily until it breaks the seals and begins to evaporate. Once all the electrolyte has evaporated, the capacitor is no longer functional and the computer endures catastrophic failures. Those failures not only cost you money but can even damage or wipe out your data!

In Figure 2 you can see the difference between a good capacitor on the right & two burst ones in the middle; the seals are broken & you can even see that the leaking electrolyte has begun to corrode the tops of the capacitors.


Figure 3 – Dry & Burst GPU Capacitors

In Figure 3 you can observe that the damage does not occur only on the motherboard; it can also be seen on a graphics card capacitor near the GPU.

In ninety percent of hardware failures in computers under 4 years of age, the culprit is overheating of the components. Fifty percent of the time, board components fail; the other fifty percent is equally divided between hard disk heat failure and power supply heat or fan failure. When a disk or PSU fails you can always replace it, but when your motherboard fails you have to replace your computer.
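If you would rather keep an eye on the temperatures yourself than wait for the motherboard alarm, here is a minimal sketch. It assumes a Linux machine that exposes its sensors under /sys/class/thermal, and it simply reports nothing on systems that don’t:

```python
from pathlib import Path

def max_cpu_temp_c():
    """Scan the Linux thermal zones; return the hottest reading in Celsius, or None."""
    readings = []
    for zone in Path("/sys/class/thermal").glob("thermal_zone*/temp"):
        try:
            # Each file holds millidegrees Celsius as plain text.
            readings.append(int(zone.read_text().strip()) / 1000.0)
        except (OSError, ValueError):
            pass  # some zones are unreadable or empty; skip them
    return max(readings) if readings else None

temp = max_cpu_temp_c()
if temp is not None and temp >= 80.0:
    print(f"Warning: hottest thermal zone at {temp:.1f} C - check fans and dust!")
```

The 80 °C threshold is my own placeholder; a noticeable rise over months, rather than any single number, is the real sign that dust is winning.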

What to do about this will be the topic of one of my next blogs.

In the meantime you can check my blog about the Huntkey T91 case, bypass the airflow problem & be ready from the start.


Importing Pictures

First, the mobile gadgets and cams

Importing pictures from them should be easy. For any appliance that uses a flash card, do not plug the device into the computer; it will always be slower than taking the card out of the camera, inserting it into a card reader and copying from there. Not all phones use external data cards, so you can use the cable or set the phone to use WiFi or Bluetooth. WiFi has a much better data speed, so set the phone to automatically back up/copy your data when it is connected to your home network or, in the case of Bluetooth, in range of your PC.

The Scanner

There are quite a few scanners out there, at all price ranges.

Since I am not working as a technical writer at a newspaper, I could not check them all, so I had to restrict myself to an affordable scanner for everyday use that can also give good results with family media in mind.

I needed to scan all kinds of media:

  • Color pictures.
  • Black & White, including grainy, artisanally developed pictures.
  • Very old sepia pictures of all sizes & provenance.
  • Negatives, 35mm & bigger ones from 70mm to 100mm.
  • Positives, mostly Agfa-based slides.

The only two I found that answered most of those criteria are the HP Scanjet G4050 and the Epson Perfection V600 Photo Scanner. The Epson quickly fell off my list because it can only scan 4 negatives at a time and because it lacks the ability to scan larger negatives. I need this option to be able to scan very old negatives that are not standard sizes. A scanner with that ability can also digitize all sorts of X-ray negatives up to letter size; it is especially useful for small dentists’ practices.

So the HP G4050 chose me! If you want the overall technical picture you can check the CNET review; I will concentrate on my in-depth usage of this scanner. It has several templates for laying out negatives & slides: sixteen 35mm slides, one 4×5, four 120mm and four 6-frame negative strips for about 36 exposures. Quite handy and useful if you want to scan a full film in one go.

So all that was on paper, and as I said in the beginning, it chose me because it was exactly what I wanted, in a neat package and at an affordable price.


After I received it I immediately began scanning in earnest. The first media I scanned was a set of faded AGFA oversized slides, and the result was mind-blowing!


As you can see, the newer scan is full size & the colors are real.

When setting the scanning options, you will have to set it for AGFA, high resolution and color enhancement; I also set scratch repair & a flurry of other parameters. Let me tell you, the scanning took a very long time, but the results, color-wise, were perfect. It resurrected the colors in a completely natural fashion. I also used a thin glass plate from an A4 picture frame to flatten the awkwardly sized negatives, with great results.

The HP software is passable: it does not always find all the frames in a template, especially for negatives; the color range on negatives is not great; the scanning is quite slow; the interface is bothersome and spotty; and it uses a lot of computer resources. In short it is not great, but it would have been livable if not for a trimming problem.

After scanning I compared the results with an earlier scan made a few years ago & spotted a problem.

The trimming of the result was not exactly the trimming I had set for the scanning; it was about 5% off toward the back side of the scanner. I tried scanning other sorts of material, & the trimming shift happened only with transparent media. I decided to try to narrow the problem down further, so I searched the web for different TWAIN drivers. After numerous tries I found an extraordinary TWAIN and scanner application that completely bypasses the HP drivers. The “SilverFast” software costs about 100 Euros, but it is worth every dime. The results, especially on negatives, are a lot better; they have more gray range and much better definition. BUT the trimming problem remained the same! At this stage I called HP support & began a week of back & forth. We tried everything possible, including changing the computer & the operating system, and it ended when they finally agreed to exchange my scanner for a new one.

A week later, having exchanged the scanner at my provider, I tried to scan negatives and, to my astonishment, the same trimming problem occurred. At this point I was quite sure it was either the hardware or the firmware. I called HP again and asked for a full refund. I must say that HP support tried their best to help me, but the scanner not being made for heavy graphics users, the support personnel were not knowledgeable enough; I even spoke with an engineer, to no avail. After another week of lost support time I got a partial refund. So for the time being I am stuck with this scanner. It is a pain to set the trimming so that the border of the picture encompasses 10 percent more, so I can trim it later. I have to set it by hand because I cannot rely on the automatic frame finding. It is time-consuming & quite annoying.

All this said, if you overlook the trimming problem and use the “SilverFast” TWAIN, the results are perfect.

Bottom line:

I have a client who is a dentist; he scans his patients’ X-rays with it and is very happy with the results, and the trimming problem does not bother him. I recommend it warmly for this kind of use.

If you only want to scan pictures, it is OK, but there are other scanners out there to check.

If you want to scan negatives and are under time restrictions, do not use this scanner.



To Save the Memories

For my first information blog I would like to speak about a subject very dear to us all: family & friends pictures. Videos will be in the next blog.

Since the digital revolution we shoot every little thing that happens in our life; we take exponentially more photographs than we did with film. It’s quick, free & has virtually endless storage, especially compared to the maximum 36 exposures of the old cameras. The younger generation can ignore that last sentence.

The casual photos don’t interest me; the dear family photos of our loved relatives & friends, be it children or parents, are what have me worried. I am a very keen family person, and I treasure the memories embedded in those pixels. This last year I have scanned more than 3000 very old pictures that my mother, at last, gave me in the form of negatives. How to scan will be one of my future blogs.

The pictures are kept on all the possible media:

  • The original camera flash memory card. It does not deteriorate easily, but it can very easily be lost or deleted by mistake, it is too small to contain all our pictures, keeping many of them becomes bothersome, and it is very costly.
  • Copy the pictures to DVD. It is a slow process, and the DVD deteriorates very easily. Leave one for a week on your car dashboard and try to play it! Its life expectancy, in the best of conditions, is about 5 years. Give or take 5 years.
  • Copy the pictures to a disk-on-key. Better, but it has the same drawbacks as the original flash memory.
  • External disks. They do have better storage capacity, but they are the absolute least reliable. Just drop one from your table and see for yourself.
  • Store it in the computer. Don’t count on laptops; they are too easily dropped or lost, unless they hold only a copy to show around. The desktop computer is a good choice, but the hard disk is quite delicate: it has a life expectancy of 3 to 5 years, and too much heat can lower that to mere months. In the daily cases I see in my line of work, there is too much need for data recovery. Too many people come to me with their dead disk in hand and sadness in their eyes, telling me that all their family photos are lost, and can I do something, please? In the best-case scenarios the disk has only bad sectors and the data can easily be copied to new media, but in most cases the disk has suffered a malfunction needing special recovery techniques. In some cases I even need to send the disk to a data recovery specialist, who may or may not, for a substantial fee, recover the beloved memories.
  • Store it online? A very good option as a secondary or tertiary solution, but it is quite slow, & once it’s on the cloud you no longer have complete control. Passwords are stolen every day from very big companies (remember the PlayStation fiasco), so anyone may some day have access to your data.
  • Print it? When you print your pictures, in the process of moving from digital to analog you automatically lose quality! So, not a very good scheme for not losing your data.

SO WHAT DO WE DO? Well, I am going to let you in on my first rule:

Always store your valuable data (here read “pictures”) on TWO or MORE different electronically rewriteable media.

How to Do This:

The first step is to move the pictures from your still camera’s flash memory to your computer (move, do not copy). Do this every day to once a week if you use the camera daily, or after every session if you use it only on holidays & at family gatherings. Move the pictures, as I said, to a dedicated folder on your desktop’s hard disk; it should be the newest disk on your newest desktop. The pictures must be kept in order, so that you can access them easily & do a thorough backup. The best way is to index the pictures for better retrieval. I myself organize my folders by year & rename my pictures with, first, the date in year-month-day format, so the files are always chronologically arranged, then the event description & then a running counter with two or three digits depending on the number of pictures, for example:

2011-11-04 Sunday at my mother in law 01

If you do not know how to batch-rename your files or are too busy, you can store them in a folder named under the same scheme without renaming the files themselves.
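If you do want to automate the renaming, here is a minimal sketch of the scheme above. The folder path and the event label are placeholders you would supply yourself, and it takes the date from each file’s modification time rather than from the camera’s EXIF data:

```python
from datetime import datetime
from pathlib import Path

def rename_session(folder, event, exts=(".jpg", ".jpeg", ".png")):
    """Rename pictures to 'YYYY-MM-DD <event> NN' so they sort chronologically."""
    files = sorted(p for p in Path(folder).iterdir() if p.suffix.lower() in exts)
    width = max(2, len(str(len(files))))  # two or three counter digits, as needed
    for i, p in enumerate(files, start=1):
        day = datetime.fromtimestamp(p.stat().st_mtime).strftime("%Y-%m-%d")
        p.rename(p.with_name(f"{day} {event} {i:0{width}d}{p.suffix.lower()}"))

# Hypothetical usage, following the example in the text:
# rename_session("D:/Pictures/incoming", "Sunday at my mother in law")
```

Test it on a copy of a few pictures first; a renaming mistake on your only originals defeats the whole purpose.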

Now that we can find all our pictures, we arrive at last at the second part of our plan to protect them: we have to choose a secondary storage for backup. The absolute minimum scheme is to use an external USB disk & copy the entire chosen folder (the one we store our whole picture collection in) to a dedicated folder on it. You will have to do that every time you add pictures to your main repository folder. You can always automate the process with a backup application. The external USB disk should never be used for anything other than backing up your data; never, ever take it to friends or family for show or use it for anything else. Store it in your safe (if you have one) or somewhere safe, not in a drawer, where it could get banged around when the drawer opens. Under these conditions it should last at least 5 years. You should change the disk every 5 years & not take chances.
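A backup application will do this for you, but the bare-minimum copy described above can be sketched in a few lines. The two paths in the usage line are placeholders, and this simple version only adds new or changed files; it never deletes anything on the backup side:

```python
import shutil
from pathlib import Path

def mirror(source, backup):
    """Copy every file under source into backup, skipping files that look unchanged."""
    src, dst = Path(source), Path(backup)
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        # Copy only if the target is missing, older, or a different size.
        if (not target.exists()
                or f.stat().st_mtime > target.stat().st_mtime
                or f.stat().st_size != target.stat().st_size):
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps

# Hypothetical usage:
# mirror("D:/Pictures", "E:/Backup/Pictures")
```

Note that shutil.copy2 keeps the file timestamps, which is what lets the next run skip everything that has not changed.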

I myself do not trust only 2 storage locations. I have four of them!

  • I have my main storage on my main personal computer, always on the newest drive.
  • The second storage is a read-only shared folder on my server, so the family can see the pictures on any of their computers or on the living-room network-attached TV. Instead of a server you can use network-attached storage as well.
  • The third device is my laptop, so I always have all my pictures with me.
  • & last but not least is an external drive at my mother’s home that I update once a year at Passover. This is in case of a major disaster like a fire or a devastating virus attack.

So there you have it: Ilan’s first rule for not losing your pictures (here, too, you can read “data”).

In the next blogs I will review other backup methods & other picture sources like phones & scanners.


Hello, world of tech-using people.

Hi, in my day-to-day job as an IT specialist I try to help people use the technology that surrounds us. I have decided to open a blog to help non-savvy technology users (that’s most of us now) in their day-to-day struggle to make sense of IT & ultimately to use the tech & not be abused by it.

Warning: English is not my main language & on top of that I am dyslexic, so if you find any mistake feel free to help. You can also find my posts in Hebrew here;

The world around us becomes more complex & difficult with each new gadget. One of my preferred authors coined a word I like very much that describes our technical surroundings quite well: Multiplex (not the theater, but the Extremely Complex; maybe it should be ExtraPlex? But I like MultiPlex better).

High-tech gadgets are everywhere, in every niche of our daily life. For example:

  • Our phones that do everything for us but lose it all at the first occasion, when they are stolen for example.
  • Our cars, that will soon phone home to big brother to tell him that we are speeding!
  • Our dogs that are GPS tracked (like our Kids).
  • Our cats that can show us in which trash bin they were today before we came home & petted them.
  • Our shoes that count our steps and pass this data to the phone to be dissected, which then tells us that we are too fat, too skinny, too stressed, or too relaxed (coffee, dear?).
  • Our fridge that will become an automatic ordering machine (any coffee left?).
  • Our computers, of which 80% of us use only 20% of the possibilities.
  • Our digital cameras that record all of our family’s most treasured memorabilia, which we then lose because we do not know how to find it or protect it.
  • Our televisions that can now browse the web or stream our lawfully copied DVD movies from our desktop or even directly from the cloud.

What is the difference (if any) between iPod, iPack, iPad, iPhone, smartphone, dumbphone, DECT phone, wireless phone, IP phone, tablet, netbook, laptop, desktop & flattop?

What is the cloud (not the one in the sky), Bluetooth (should I look in the mirror?), Wi-Fi security like WEP, WPA or WPA2, those self-replicating ports in the back of my PC, HD 1080p, HDMI, DisplayPort, DVI (sorry, did I begin to sound Japanese?)? Do we need them, & if yes, which kind do we need?

So here I am. I will try to help you juggle the technology & warn you of the pitfalls. I will try to post one subject per week, but since I am working, it may take longer than that between blog posts.
