Smart City: Big Data and the Internet of Things
Reading Time: 3 minutes

As governments worldwide aim to become smarter, following cities like Dubai, Barcelona, or Singapore, their ultimate goal is to become perceptive and quick to act. Smart city governments need emerging technologies like Big Data and the Internet of Things to understand their citizens’ needs and preferences in real time, and to use that understanding to assess, decide, and react, whether in milliseconds, seconds, minutes, or months. The acceptable latency of a decision usually depends on the scale of the problem, be it an anomaly, a hazardous event, or a city-wide disaster. To illustrate, if a pipeline is leaking, a modern smart city platform can detect the threat and automatically shut down the gas supply in the affected area. Such analytics applications are latency-sensitive: the decision must be made in milliseconds.
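To make the latency point concrete, here is a minimal Python sketch of an edge-side rule that reacts to a simulated pressure drop. Everything in it is hypothetical: read_pressure_kpa() and shut_valve() stand in for whatever sensor and actuator APIs a real pipeline platform would expose, and the thresholds and numbers are invented for illustration.

```python
# A minimal sketch of a latency-sensitive edge rule for leak detection.
# read_pressure_kpa() and shut_valve() are illustrative placeholders,
# not a real device API; thresholds are invented for the example.
import random
import time

NORMAL_KPA = 400.0          # assumed nominal line pressure
LEAK_DROP_THRESHOLD = 0.2   # flag a leak on a sudden drop of more than 20%

def read_pressure_kpa() -> float:
    """Simulated feed: mostly nominal, with a small chance of a sharp drop."""
    if random.random() < 0.01:
        return NORMAL_KPA * 0.5  # simulated leak event
    return NORMAL_KPA * random.uniform(0.95, 1.02)

def shut_valve(segment_id: str) -> None:
    """Stand-in for the actuator call that isolates a pipeline segment."""
    print(f"EMERGENCY: closing valve on segment {segment_id}")

def monitor(segment_id: str, interval_s: float = 0.01) -> None:
    baseline = read_pressure_kpa()
    while True:
        reading = read_pressure_kpa()
        # Decide locally, at the edge, so the reaction stays in milliseconds
        # rather than waiting on a round trip to a central data center.
        if reading < baseline * (1 - LEAK_DROP_THRESHOLD):
            shut_valve(segment_id)
            break
        baseline = 0.9 * baseline + 0.1 * reading  # smooth the baseline
        time.sleep(interval_s)

monitor("district-7-main")
```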

With all the advanced technologies available to build Big Data and Internet of Things platforms for a smart city, we still face challenges in helping these platforms realize their full potential. From my experience with clients working on smart city plans, privacy and security are not the only government concerns; interoperability is a main issue. That’s why the ability to unlock data, so that governmental agencies can move it from one system to another in a safe, secure, and confidential manner, is key. When we discuss data in a smart city, we are talking about semi-structured data from sensors, unstructured data from social media or images and videos collected from video surveillance, and other shared open data like bus schedules or city maps. To reach full interoperability, we need a joint strategy covering both technical interoperability, so that systems can connect, and organizational interoperability, so that agencies can collaborate. Organizational interoperability covers aspects like organizational strategies and policies, laws, business processes, cost, and collaborative work, while technical interoperability deals with data lineage, semantics, and infrastructure.
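As a rough illustration of the technical side of interoperability, the sketch below normalizes records from two invented city sources, a sensor gateway and an open transit feed, into one shared schema that can travel between systems while keeping the original payload for lineage. The field names and formats are assumptions for the example, not any real municipal standard.

```python
# A minimal sketch of technical interoperability: normalizing records from
# heterogeneous city sources into one shared schema so they can move between
# systems. All source formats and field names below are invented.
import json
from dataclasses import dataclass, asdict

@dataclass
class CityRecord:
    source: str      # which system produced the record
    timestamp: str   # ISO 8601, agreed across agencies
    kind: str        # e.g. "sensor", "social", "open_data"
    payload: dict    # source-specific detail, kept intact for lineage

def from_sensor(raw: str) -> CityRecord:
    d = json.loads(raw)  # semi-structured JSON from a sensor gateway
    return CityRecord("traffic-sensors", d["ts"], "sensor",
                      {"sensor_id": d["id"], "value": d["v"]})

def from_open_data(row: list) -> CityRecord:
    # e.g. a bus-schedule CSV row: [route, stop, arrival_time]
    return CityRecord("transit-feed", row[2], "open_data",
                      {"route": row[0], "stop": row[1]})

records = [
    from_sensor('{"id": "S-101", "ts": "2016-05-01T08:00:00Z", "v": 42}'),
    from_open_data(["Route 7", "Main St", "2016-05-01T08:05:00Z"]),
]
for r in records:
    print(asdict(r))
```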

That’s why we are witnessing governments appointing chief data scientists, like DJ Patil in the White House. National chief data scientists can act as catalysts who push for this full interoperability across government, driving collaboration both inward, across governmental agencies, and outward, with external data partners and citizens. Another interesting opportunity lies in adopting technologies like data virtualization, which can integrate data without moving it from its physical storage. Data virtualization suits such public sector use cases because it works around regulatory constraints, departmental politics, and traditional data governance methodologies that might limit access to data in the source databases. It also helps when there is mounting pressure from the business side for a quick-win, cost-effective solution. Quick success stories can help governmental agencies realize the potential of data in their departments. And since, as discussed above, data comes from diverse sources, data virtualization can act as a unified access layer just below the data visualization and application layer, which eventually gives data scientists a single point from which to access, visualize, and tell stories with the data flowing through the smart city’s big data platform.
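Here is a minimal sketch of the data virtualization pattern, under the assumption that each agency keeps its data where it already lives. The VirtualLayer class, source names, and figures are all illustrative; real products in this space are far more capable, but the core pattern, querying sources in place through one access point, looks like this.

```python
# A minimal sketch of the data-virtualization idea: a thin virtual layer
# answers queries by delegating to each source in place, instead of copying
# data into a central store. Source names and fields are hypothetical.
import sqlite3

class VirtualLayer:
    """Registers sources and routes queries without materializing the data."""
    def __init__(self):
        self.sources = {}

    def register(self, name, fetch_fn):
        self.sources[name] = fetch_fn  # fetch_fn runs against the source system

    def query(self, name, **params):
        return self.sources[name](**params)

# Source 1: a relational database an agency already operates (in-memory here).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE permits (district TEXT, count INTEGER)")
db.execute("INSERT INTO permits VALUES ('north', 12), ('south', 7)")

# Source 2: a live API or file feed, stood in for by a plain function.
bus_delays = {"north": 3.5, "south": 1.2}  # minutes, illustrative numbers

layer = VirtualLayer()
layer.register("permits", lambda district: db.execute(
    "SELECT count FROM permits WHERE district = ?", (district,)).fetchone()[0])
layer.register("delays", lambda district: bus_delays[district])

# A data scientist gets one access point; the data never leaves its source.
print(layer.query("permits", district="north"))  # -> 12
print(layer.query("delays", district="north"))   # -> 3.5
```

The design choice worth noting is that nothing is copied or transformed up front; each query runs against the source in place, which is what lets this approach sidestep the data-movement constraints discussed above.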

If we remember the flint knives used in the daily life of a Stone Age caveman, data, like any tool, is a means to an end. Successful companies and governments in the 21st century are using such tools to build data-driven products. That’s why we need to ask how our own culture can tailor this tool to be used collaboratively in building a smarter governance ecosystem.

What other challenges do you see facing governments planning their smart city initiatives? Feel free to share your thoughts and comments below.

Ali Rebaie