The OSM map of Mumbai is “b0rked”. Due to some problem with the initial import, all the streets are slightly out of alignment. After a lot of experimentation by all concerned, it was found that this cannot be corrected programmatically and can only be done manually. I organised a sprint in Mumbai to start off the work; some progress was made, but it was not followed up. The upshot is that Mumbai is the only city in India with an inaccurate OSM map: accurate in parts, but by and large unreliable.
Recently, Wikipedia used the OSM map of Mumbai to illustrate the terrorists’ points of attack. I pointed out to the Mumbai LUG that it is not a good thing to have an inaccurate map. Many people were of the view that open maps like this only help terrorists and should be banned! So the question is: are openness and security mutually incompatible? Often one sees the view expressed: if your source code is visible to crackers, how can you protect your software against them? The best security is secrecy—if they cannot see the code, it is all the more difficult to crack it.
Looking at the Mumbai attacks, the terrorists had all the information they needed; it was the security forces who were hampered by a lack of information. If they had had instant access to the floor plans of the places under attack, with digital maps and helmet-mounted displays showing where they were, the loss of life could have been far lower and the operation far more efficient.
The problem with keeping information secret is twofold. First, when the information is needed, getting hold of it is time-consuming: terrorists attack at night, when the guy with the password for the secret maps is asleep. Second, if the maps are secret, there is no way of finding out whether they are accurate. The same goes for proprietary code: since no one can see it, there may be a large number of undetected vulnerabilities and back-doors that crackers can exploit. Yes, open source code also has vulnerabilities, but because the code is open, these show up quickly and can be rectified.
Open source software is so successful because the developer employs the end user as a partner. The result is that hundreds of thousands of people are involved in developing, testing and patching open source software. The same thing happens on open content sites like Wikipedia and OSM. Yes, terrorists will use the maps; but when disaster strikes, be it a flood, a tsunami or an earthquake, accurate and instantly accessible maps will save a huge number of lives. Apart from major disasters, in individual emergencies like heart attacks and accidents too, accurate maps enable efficient routing, which saves precious minutes of ambulance time.
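The routing point can be made concrete with a small sketch. Assuming a toy street network (the node names and travel times below are invented for illustration, not real Mumbai data), a standard shortest-path search finds a quick ambulance route on an accurate map, while a map missing one misaligned link forces a long detour:

```python
import heapq

def shortest_time(graph, start, goal):
    """Dijkstra's algorithm: minimum travel time in minutes from start to goal.

    graph maps each node to a list of (neighbour, minutes) edges.
    Returns float('inf') if the goal is unreachable.
    """
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        t, node = heapq.heappop(heap)
        if node == goal:
            return t
        if t > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, minutes in graph.get(node, []):
            nt = t + minutes
            if nt < dist.get(nbr, float("inf")):
                dist[nbr] = nt
                heapq.heappush(heap, (nt, nbr))
    return float("inf")

# Hypothetical grid: the accurate map knows about the direct A-D street.
accurate = {
    "A": [("B", 4), ("D", 3)],
    "B": [("C", 4)],
    "C": [("D", 4)],
    "D": [],
}

# The same grid with the misaligned A-D link missing: a detour via B and C.
inaccurate = {
    "A": [("B", 4)],
    "B": [("C", 4)],
    "C": [("D", 4)],
    "D": [],
}

print(shortest_time(accurate, "A", "D"))    # 3 minutes
print(shortest_time(inaccurate, "A", "D"))  # 12 minutes
```

Real routers (OSRM, GraphHopper and the like) use far more elaborate variants, but the dependence is the same: the route is only as good as the map data underneath it.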
Whether it is a server, an application or protection against terrorist acts, security is a process that involves the developers, sysadmins and authorities on the one hand and the end users and citizens on the other. Where the code and the maps are open, and both sides are partners in closing loopholes and improving the system, the system becomes more secure. On the other hand, if everything is kept secret and closed, the citizens are kept out of the loop; in times of emergency, panic and confusion reign, and the disaster is greatly magnified.
We should learn a lesson from the US government, which has a rule that anything developed with public money is placed in the public domain. It has given the world the GPS system, released the CIA's maps for public use, and, through the US Army, released world-class 3D CAD software (BRL-CAD) and GIS software (GRASS). No doubt there will be a knee-jerk demand that information helpful to terrorists be classified, but the people who will suffer if it is classified are the public, not the terrorists.
Openness and security are two sides of the same coin—one cannot exist without the other.