
Citizen-Centric Smart City – What does it take?

Even as the ink was drying on my previous blog about Blockchain in the context of Smart Cities, I had the opportunity to attend a workshop on the evolving IoT landscape and the practical opportunities and challenges of implementing it to deliver an enhanced city experience.

This got me thinking – while there are multiple “new-age technology initiatives” that hold a lot of potential, how does one narrow down on the right technology to adopt? We are all aware that cities do not have an unlimited monetary supply that lets them be wishful in their approach to exceptional service delivery; on the contrary, most cities today are struggling with the funds at their disposal. This calls for due diligence and deep thinking from a customer perspective – what does a good city service mean for your citizens/residents? A city resident anywhere in the world has a minimum set of expectations that must be delivered against, and once this is achieved, city executives can drive further initiatives to deliver a signature city experience that sets their city apart from the rest. The focus of this blog will be on the former – what does a citizen/resident expect from a Smart City service like public transport, utility payment, land registration, public safety and security, or traffic management?

The following graphic represents the 3 key attributes of a Smart City Service and the citizen/resident’s expectations against each of them:


From the perspective of the City Administrators, these statements represent the voice of their customer and provide direction for shaping service delivery in alignment with citizen expectations. While each city differs in the way its agencies/departments are structured, the leaders within a city administration need to put their thinking hats on and figure out what they need to deliver against these citizen expectations. The following extension to the graphic represents this at a broad level; the city administration should delve deeper and evolve its master plans against them.


To achieve these capabilities, there are multiple measures that could be taken – operational optimization, organization restructuring, performance monitoring, peer benchmarking, citizen engagement etc. The one initiative that will have the highest impact (in concert with the above measures) is the adoption of relevant technology. City administrations are not new to technology, and most of them have already adopted it in some form or the other. However, these technologies are predominantly inward-focused (easing operations, publishing reports, regulatory obligations etc.). Today’s world demands that city administrators look at technology through an entirely different lens, one with a strong emphasis on customer expectations. This requires some thought on how technologies can be leveraged to deliver a positive city experience to citizens/residents. It is in this regard that the relevance of new-age tech initiatives comes to the fore. The graphic below extends the story further and details my point of view on the technology initiatives that could be embraced. Note that these are not a replacement for existing systems but a complement that unlocks their true potential.


Reliability is established on one simple premise – having access to the right information at the right time. The mapping of IoT (Internet of Things) as the technology of choice for this service parameter is based on the same principle. For example – Mr. A wants to travel from his home to a Convention Centre to speak at a conference being held there. Considering the importance of being there on time, he wants to know the best way to get there – drive down vs hire an Uber vs take public transport. Having travelled extensively on this route, he is concerned based on his past experiences. More often than not, he has experienced heavy traffic on this route. Recently, he struggled to find parking for his car and eventually had to park far from the Convention Centre and walk all the way back. On another occasion, on his way back from the Convention Centre, unexpected rains lashed down and his favorite car bore the brunt.

So, his expectation of reliable travel from home to the Convention Centre depends on him receiving the right information about the real-time traffic situation, parking availability in and around the Convention Centre, and the weather forecast for the day. This can reliably be achieved by deploying sensors across the city and feeding the data they generate into an IoT platform operated by the city administration. The IoT platform can then draw correlations and run prediction algorithms (which requires analytical capability) that eventually provide contextual information for Mr. A to plan his travel.
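As a sketch of how such a platform might turn raw sensor feeds into contextual advice, the following snippet scores Mr. A's travel options. The signal names, weights, and thresholds are invented for illustration; a real platform would use trained prediction models rather than hand-set heuristics.

```python
# A minimal sketch (hypothetical signals and thresholds) of how a city IoT
# platform might combine sensor feeds to advise a commuter like Mr. A.

def advise_commute(traffic_index, parking_spots_free, rain_probability):
    """Rank travel options from simple sensor-derived signals.

    traffic_index: 0.0 (free-flowing) to 1.0 (gridlock)
    parking_spots_free: open spaces near the destination
    rain_probability: 0.0 to 1.0 from the weather feed
    """
    scores = {
        # Driving suffers from traffic, scarce parking, and rain.
        "drive": (1 - traffic_index)
                 + (0.5 if parking_spots_free > 10 else 0.0)
                 - 0.3 * rain_probability,
        # Ride-hailing avoids the parking problem but not the traffic.
        "ride_hail": (1 - traffic_index) + 0.4 - 0.1 * rain_probability,
        # Public transport is largely immune to road traffic and parking.
        "public_transport": 0.8 - 0.1 * rain_probability,
    }
    return max(scores, key=scores.get)

# Heavy traffic, scarce parking, rain forecast.
print(advise_commute(traffic_index=0.9, parking_spots_free=2, rain_probability=0.7))
# -> public_transport
```

With free-flowing traffic and ample parking the same function recommends driving; the point is only that correlated feeds, not any single sensor, produce the advice.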

City administrations also need to establish a level of transparency that builds trust and has the citizen appreciate the efforts being put in by their city to make lives easier. Today, every city agency has IT systems used to record all the operational activities (meter reading, bill payments, maintenance schedules etc.) that the agency is responsible for. In a few cases, such information may be recorded on paper or in a rudimentary spreadsheet. However, these records are not available beyond the boundary of the owning city agency, and this lack of visibility prevents a city-wide operational view – a Common Operating Picture. This is where an initial version of an Open Data Platform needs to be established to drive the exchange of government data between agencies. This needs to be supplemented with reliable recording of cross-domain operational activities on a common ledger that can be trusted and accessed by every party based on their access permissions. This is where Blockchain comes to the fore. Consider again Mr. A, who travels regularly between his home and the Convention Centre: he wants to know what the city is doing to make his travel easier. City agencies could extend the open data platform to citizens (sanitized to ensure that sensitive information is not shared) so that Mr. A can also gather a true and transparent view of the relevant work being done by his city administrators.

Further, considering that the city is a huge ecosystem, we cannot expect that there will be absolutely no failures during operations. What irks most citizens is that identifying where the fault lies is a huge challenge, and they are left running from pillar to post to find the root cause and plug it. This is when the call for accountability needs to catch the attention of city administrations. Blockchain’s common ledger lends itself beautifully to this requirement. For example – Mr. A comes across a huge pothole on the road following two weeks of cable-laying works. As a responsible citizen, he reports this to the city’s single-window operator. This warrants a deeper investigation, and thanks to an existing Blockchain ledger, it is observed that the contractors responsible for the cable-laying work captured proof of their completed maintenance activities, noting that the Roads Agency was informed to complete the road repair as per the contractual clauses in the Smart Contract. This helps narrow down the deviation in service to the right agency so it can be fixed accordingly. There is no longer a problem of each agency having a different view of the truth about what might have happened. Combine this with the transparency established through the Open Data platform and the reliability of the data coming from the instrumentation across the city, and your citizen is bound to have a happy city experience.
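To illustrate the "single view of the truth" idea, here is a toy hash-chained ledger. It is deliberately not a real blockchain network (no consensus, no distribution, no smart contracts); it only shows how chained hashes make a shared cross-agency record tamper-evident. Agency names and activity text are illustrative.

```python
import hashlib
import json

# A toy append-only ledger: each entry embeds the hash of the previous one,
# so retroactively editing any record invalidates everything after it.

class Ledger:
    def __init__(self):
        self.chain = []

    def record(self, agency, activity):
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        entry = {"agency": agency, "activity": activity, "prev": prev_hash}
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.chain.append(entry)
        return entry["hash"]

    def verify(self):
        # Recompute every hash; any edit to a past entry breaks the chain.
        for i, entry in enumerate(self.chain):
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["hash"] != hashlib.sha256(
                    json.dumps(body, sort_keys=True).encode()).hexdigest():
                return False
            if entry["prev"] != (self.chain[i - 1]["hash"] if i else "0" * 64):
                return False
        return True

ledger = Ledger()
ledger.record("CableContractor", "Cable laying complete; Roads Agency notified")
ledger.record("RoadsAgency", "Road repair scheduled per contract clause")
print(ledger.verify())  # True; tampering with any entry flips this to False
```

Because every agency writes to (and verifies against) the same chain, the pothole investigation above reduces to reading one record rather than reconciling several conflicting databases.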

Remember, the best is yet to come.

I’ve opened my Government data – What next?

In my previous blog, I published my views about how governments can (and should) conduct due diligence in identifying the relevant datasets from their blackbox databases and open them up. This included identifying attributes of open data and mapping the data to a 5-star rating. Having touched on the significant potential that Open Government Data holds in the evolution of the Smart City ecosystem, I paused my thoughts with a question –

Is the Open Government Data story complete once government entities have made data available in 5-star format (the best possible format)? Or would you say the story has only started on a strong footing?

The answer to that question is fairly obvious to anyone who understands the supply-demand equation of any transaction. However, what is important to note in the Open Government Data context is that the demand side of the equation involves substantial dynamics. There are two very critical aspects of demand – driving consumption and, more importantly, driving value-generation from open government data. The illustration below captures this journey succinctly.

 Three stages of Open Govt Data journey

The rest of this blog will detail what it takes to drive the consumption story for open government data.

Driving Data Consumption

Open Govt data journey - Stage 2

In a simplistic view, all this requires is for government entities to make a few commitments and honor them at all times.

  • Commitment to provide fresh data at all times
  • Commitment to bind the data service with a Service Level Agreement (SLA)
  • Commitment to maintain data quality at all times
  • Commitment to ensure data anonymity
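One hedged sketch of how these commitments could be made machine-checkable: express them as fields in the dataset's metadata and validate every release against them. The field names and thresholds below are assumptions for illustration, not any published standard.

```python
from datetime import date

# Illustrative SLA terms a provider agency might publish alongside a dataset.
SLA = {
    "max_staleness_days": 30,   # "fresh data at all times"
    "min_completeness": 0.95,   # "maintain data quality at all times"
    "anonymised": True,         # "ensure data anonymity"
}

def meets_sla(dataset, today):
    """Check one dataset release against the published SLA terms."""
    checks = {
        "fresh": (today - dataset["last_refresh"]).days <= SLA["max_staleness_days"],
        "quality": dataset["completeness"] >= SLA["min_completeness"],
        "anonymity": dataset["anonymised"] == SLA["anonymised"],
    }
    return all(checks.values()), checks

dataset = {"last_refresh": date(2024, 1, 10), "completeness": 0.97, "anonymised": True}
ok, detail = meets_sla(dataset, today=date(2024, 1, 31))
print(ok)  # True: refreshed 21 days ago, 97% complete, anonymised
```

Publishing the SLA terms as data, rather than prose, lets consumers automate the trust check on every refresh.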

The key to data consumption is to provide the Data Consumer sufficient evidence to instill trust in the relationship. As with any relationship, an honored commitment is the best way to drive this. Hence, it becomes essential for Data Provider agencies to stay in lockstep with consumers at all times. One key concern most consumers have is that governments are usually high-handed and set the rules of the game. Here is a scenario – a flourishing start-up has built a rich mobile app and open API for a service that brings together datasets from three different government agencies and combines them with data gathered from two other private entities. The app sources government data from an open data portal hosted by the government. The app has been in the market for about a year and has seen good uptake because of the uniqueness of the service it offers. The start-up has been making healthy revenues through the mobile app and the open API that renders this service. Now, one of the government agencies conducts an internal study and puts in place a regulation that restricts the contours of data shared outside the government. Following this, what if the agency decides that –

  • A certain dataset that was being used by the start-up will not be made available from the next quarter
  • The dataset refresh will be done only once every quarter instead of monthly
  • The nature/quality of data shared will change from the next refresh onwards
  • The dataset access will be blocked completely with immediate effect

The flourishing start-up will have no choice but to rework their innovative service around these new changes, provided that is feasible and practical. Unlike a B2B relationship where both parties have almost equal say, a G2B relationship is steered by one party – the Government.

It is to be recognized that a lack of transparency has an adverse impact on public trust in the objectives and motives of the government. Open Data consumption is not a one-time task but a continual process that requires objective commitment from the data source entity over an extended period of time to gain the confidence of data consumers. It is time governments got the balance right in the G2B relationship – for example, by coming out with clear SLAs that govern the relationship, as is common practice in B2B and B2C relationships.

Another area governments need to work on is influencing and creating the perception that they are doing enough to protect individuals’ rights to privacy and the confidentiality of the data held by them. The last thing a data consumer wants is to be entangled in legal issues because the data was not anonymised1 or pseudonymised2 adequately and accurately at the source. Governments should be able to confidently state that the data is anonymised to an extent that rules out any chance of reconstruction through the Mosaic Effect3.
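A minimal illustration of pseudonymisation at the source: replace each identifier with a keyed hash, so records stay linkable across releases without exposing the underlying identity. The key name and token length here are illustrative; a real deployment would use a managed key store and a documented tokenisation policy.

```python
import hashlib
import hmac

# Keyed hashing (HMAC) rather than a plain hash: without the secret key,
# an attacker cannot brute-force known identifiers to reverse the tokens.
SECRET_KEY = b"agency-held-secret"  # illustrative; keep in a real key store

def pseudonymise(identifier: str) -> str:
    """Replace an identifier with a stable, non-reversible token."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

record = {"citizen_id": "ID-1234", "water_usage_litres": 310}
published = {**record, "citizen_id": pseudonymise(record["citizen_id"])}
print(published["citizen_id"])  # same token for this citizen in every release
```

Because the token is deterministic, a consumer can still join a citizen's records across datasets, which is exactly what full anonymisation would prevent; the choice between the two is a policy decision the agency must make per dataset.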

Generating value from Open Data

Open Govt data journey - Stage 3

Once trust between the Data Source entities and Data Consumers has been established, it is mostly up to the data consumers to tap into the data and generate value that was previously unseen for various reasons. More often than not, the value generation comes from the fact that data consumers are able to correlate various datasets – government data, private data, and proprietary data – and render use cases that wouldn’t be possible otherwise. Having said that, governments can still play a substantial role in positively influencing the larger ecosystem.

As an example, in one of my earlier posts I mentioned the 5-star rating by Tim Berners-Lee. One key aspect of open government data – at every level from 1-star to 5-star – is that the data consumer should be able to further license the data without restrictions on use, as part of the public domain. Public data should be released such that it enables free re-use, including commercial re-use. The ability to distribute data without restrictions will encourage consumption and generate new avenues in the city ecosystem to leverage the intrinsic value of open government data. This will spur further innovation.

Another way governments can play an active role in encouraging the community to innovate based on open data is by ensuring that datasets of real value are being made available. While the government may have thought through the data that can be opened up, it is only at the consumption stage that the lacunae in the nature or quality of the data become apparent. Governments should establish a mechanism by which consumers can submit their concerns about existing datasets or place requests for more relevant datasets. The government will be able to know the pulse of the consumer community only when such a closed loop exists. At the end of the day, the value of open government data is only realized when the data consumers can generate experiences (through mobile apps, open APIs et al) that enhance the living experience of the residents.

In conclusion, governments have an active role to play all through the Open Data journey – from data identification to value-generation. Most governments consider their job done once the data is made available on the Open Data platform. As mentioned above, that is just half the job and will serve minimal purpose without a focus on the data consumption side. With large initiatives of this nature, it is essential that government entities keep receiving encouraging signs so that they stay engaged and the Open Data initiative sustains over a long tenure. Hence, it becomes essential to ensure that the data consumers are also constantly engaged and their expectations are reasonably met. The need is to establish an ecosystem where all stakeholders participate and play their role towards delivering an enhanced living experience.

1 Anonymised Data – Data relating to a specific individual where the identifiers have been removed to prevent identification of that individual.
2 Pseudonymised Data – Data relating to a specific individual where the identifiers have been replaced by artificial identifiers to prevent identification of the individual.
3 Mosaic Effect – The process of combining anonymised data with auxiliary data in order to reconstruct identifiers linking data to the individual it relates to.


The most significant outcome of a smart city (and its key indicator) is providing citizens alternatives and opportunities to lead a better life. This could be in the form of efficient and effective public transportation, proactive traffic monitoring and easing, automated monitoring of utility services, weather management, emergency management, public safety and, more importantly, an amalgam of these services through correlations. Each of these Smart City services (note that the above list is not exhaustive) is data-intensive and results in reams and reams of real-time data that, when leveraged, can generate meaningful insights, further driving an enhanced experience for all city stakeholders.

While city agencies and governments worldwide have been spending effort through various initiatives (e.g. Share-PSI) to tap into this data and generate value, they are also limited by the resources (time, money, labor) at their disposal. What if the reams of data generated through city/government initiatives were made available to private entities and the general public at large? Of course, this needs careful scrutiny of what data can be shared beyond the boundaries/firewalls of the agencies. However, that should be a small hurdle to overcome considering the immense potential of the data that external stakeholders will tap into, further enhancing the city ecosystem. This needs governments to open up – open up between themselves and open up to the external world. This needs Open Government Data.

In an earlier blog, I had highlighted how Government data can be used in different contexts – Government to Government (G2G), Government to Business (G2B) and Government to Citizen (G2C). The progression to Open Government Data needs a methodical approach and ideally takes the following transition path.

Open Data graphic - Transition

Each government agency needs to scrutinize its data to identify datasets that are sought by other agencies and the non-sensitive datasets that can be opened up between each other. A further level of scrutiny is required to identify the subset of data that can be exposed to non-government city stakeholders (private entities, general public).

However, not all data that can potentially be opened up will be really helpful. Some of the data may be in a very crude form and will not help data consumers, since they cannot leverage it without extensive effort and investment. For example, scanned (anonymised) application forms are of little value until the data is actually digitized through some OCR mechanism or manually. This discourages consumers (more specifically, the technical community) from tapping into the data even if it is made available. In this era of devops and agile, the idea with Open Data is to provision datasets that can be tapped into easily to generate value quickly. So, how does one identify high-value datasets – data that is smart by default?

What does Smart Data mean?

While there cannot be a binary method of identifying Smart Data, some very detailed parameters have evolved from the discussions at The Open Group. One such discussion arrived at the following 9 dimensions of quality that should be applied to data:

Attributes of Good open data

While these 9 quality parameters are important, one needs to look into the specific business requirement and the corresponding datasets to assign weightage factors to each of these parameters suiting the context. It is also to be noted that each parameter has a further level of detail that has to be studied before declaring data to be of high quality. For example, is Credibility defined only by the trustworthiness of sources – what if the data has undergone some transformation in the interim before being made available?
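One way to operationalise the context-specific weighting described above is a simple weighted average over the quality dimensions. The dimension subset, weights, and scores below are illustrative choices for a hypothetical transport-planning context, not values prescribed by The Open Group discussion.

```python
# A sketch of context-specific weighting across data-quality dimensions.

def quality_score(scores, weights):
    """Weighted average of per-dimension scores (each on a 0.0-1.0 scale)."""
    total_weight = sum(weights.values())
    return sum(scores[d] * w for d, w in weights.items()) / total_weight

# A transport-planning context might weight timeliness and accuracy heavily.
weights = {"credibility": 2, "timeliness": 3, "processability": 2, "accuracy": 3}
scores = {"credibility": 0.9, "timeliness": 0.6, "processability": 0.8, "accuracy": 0.7}

print(round(quality_score(scores, weights), 2))  # -> 0.73
```

A different context (say, historical land records) would reuse the same scores with different weights, which is exactly why a single binary "smart / not smart" label does not work.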

Another example – the Processability parameter mentioned above can also be studied further using the 5-star data definition provided by Tim Berners-Lee.

5-star rating of Open data

Most government agencies will have a mix of these different segments of rated data, with a heavy leaning towards one-star and two-star data. While one-star and two-star data is fairly easy to generate, it limits data usage on the consumers’ side when exposed and made available as Open Data. Generally, there are very few consumers willing to invest and/or competent enough to refine the provider data further to make it more consumable, and hence the uptake of this kind of data will be low. Provider agencies will need to invest in progressing further along the maturity roadmap – make data non-proprietary, add semantics and link to related data/content. More importantly, they should adopt these new methods for all data generated to date and in the future. As a data provider agency progresses along this maturity roadmap, it will start seeing corresponding adoption and value-generation from the larger city ecosystem. It is to be noted that the progression towards 5-star data will involve a change in organization practices and culture, but once that becomes business-as-usual, the effort required is fairly low compared to the uptake one gets to see on the consumer end.

Effort - Provider vs Consumer
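To make the star progression concrete, the sketch below takes a 3-star row (structured, open, non-proprietary CSV) and re-expresses it as subject-predicate-object triples with URIs, the shape that 4-star and 5-star linked data requires. The URIs and dataset are invented for illustration; a real agency would mint stable, dereferenceable identifiers.

```python
import csv
import io

# A 3-star dataset: open license, structured, non-proprietary format.
CSV_DATA = "stop_id,stop_name,city\nS101,Central Station,Metropolis\n"

def csv_to_triples(text, base="http://data.example.gov/transport/"):
    """Re-express CSV rows as (subject, predicate, object) triples with URIs."""
    triples = []
    for row in csv.DictReader(io.StringIO(text)):
        subject = base + row["stop_id"]  # each row gets its own URI
        for field, value in row.items():
            if field != "stop_id":
                triples.append((subject, base + "def/" + field, value))
    return triples

for triple in csv_to_triples(CSV_DATA):
    print(triple)
```

Once every entity has a URI, the 5-star step of linking to other agencies' data (e.g. pointing a stop at a Tourism department event URI) becomes a matter of adding one more triple rather than restructuring the dataset.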

How can governments be smart?

Most governments worldwide have opened up to the idea of Open Data, and the ones that have not will only delay but eventually get there. The question is no longer whether government agencies will open their data; it is when and how they will open their data. It requires strategic planning by governments to drive collaborative execution of initiatives of this nature across agencies. Substantial focus on adoption enablement, to ensure governance and adherence to standards, is essential.

Exchanging data between agencies does not come naturally to most government organizations, and when they do share data, they rely on very manual or archaic methods – paper-based, phone requests, email requests etc. Initially, the agencies have to move to an operating model where data is made available on a data exchange platform through a single window (e.g. a portal). Data can be requested and procured through the same window – either in real time or in batch mode depending on the nature of the request. At a minimum, this will ease government operations and make them more effective and efficient. It also makes life easier for citizens, who no longer have to share the same data multiple times with different agencies.

This is best implemented by encapsulating the data-sharing services as APIs, since this can potentially foster further innovation within the government ecosystem.
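A hedged sketch of what such an API-encapsulated single window might look like: a lookup that enforces per-dataset access rules and returns data with provenance metadata. The dataset paths, agency names, and permission model are all invented for illustration; a production service would sit behind real authentication and an API gateway.

```python
import json
from datetime import date

# Illustrative catalogue: some datasets are open, some restricted to
# government callers until they have been cleared for wider release.
DATASETS = {
    "transport/stops": {"owner": "TransportAgency", "open": True,
                        "rows": [{"stop_id": "S101", "name": "Central Station"}]},
    "utility/meters": {"owner": "WaterAgency", "open": False,
                       "rows": [{"meter_id": "M9", "reading": 310}]},
}

def get_dataset(path, caller_is_government=False):
    """Single-window lookup with per-dataset access control and provenance."""
    ds = DATASETS.get(path)
    if ds is None:
        return {"status": 404, "error": "unknown dataset"}
    if not ds["open"] and not caller_is_government:
        return {"status": 403, "error": "restricted to government callers"}
    return {"status": 200, "owner": ds["owner"],
            "retrieved": date.today().isoformat(), "data": ds["rows"]}

print(json.dumps(get_dataset("transport/stops"), indent=2))
```

The same function serves both G2G and G2B callers; opening a dataset to the public later is a one-flag change rather than a new integration, which is the practical appeal of the API approach.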

Once the data has been opened up between agencies, it becomes relatively easy to progress to sharing the non-sensitive data with non-government stakeholders. The API approach can be leveraged further to encourage innovation in the digital economy.

Open Government data - Progression path

The next level of progression will be to linked open government data (LOGD) and its use as a revenue stream. LOGD can demonstrate value in a wide range of use cases that were not thought of earlier. As an example, imagine the impact of a service that takes real-time public transport data (from the Transportation department) for an event in the city (organized by the Tourism department), links it up with weather data (gathered from the Meteorological department), and helps citizens plan their journey.

Governments need to take up planned initiatives to tap into the potential of locked-up data. The data needs to be pruned and polished to make it more relevant and to ease consumption. This data, once tapped into by the city ecosystem, can be applied in daily-life scenarios that impact the community and thereby deliver a signature city experience. The possibilities are immense. All that is required is to take the initiative and tap into the value of the new natural resource – data. The sooner the better.

Closing thought – Is the Open Government Data story complete once government entities have made data available in 5-star format (the best possible format)? Or would you say the story has only started on a strong footing? There is a lot more to follow…

The “Smart” in Smart Cities

Having traveled to different countries, I am quite demanding when it comes to my city experience. I am sure that today’s “well-traveled” urban resident also has equally high (if not higher) expectations of their city. The overall city experience is driven by the city planner’s vision for their city and the execution of this vision by various city agencies. In today’s scenario, this vision more often than not involves an aspiration to transform into a Smart City. I am penning this blog against the backdrop of a huge awareness of Smart City initiatives in the emerging markets – MEA (Middle East & Africa) and India.

I work in the MEA region, which brings together a spectrum of countries that are at different points in their evolution journey and are driving Smart City programs in pockets. I come from India, where the government recently announced plans to develop 100 Smart Cities in 5 years. An obvious observation would be that a resident of Dubai (UAE) has very different expectations from one in Nairobi (Kenya), and a resident of Johannesburg (South Africa) has different expectations from one in Bangalore (India). However, every city dweller wants one thing in common – a better way of life in the city in which they reside. Everyone likes to be in a place that welcomes them and delivers a signature city experience.

So, what makes a city “Smart”?

The city ecosystem is made up of important entities – people, agencies, systems, procedures et al. Smart City initiatives have to be tied to these entities to drive improvements and deliver exceptional experiences. I believe the transformation into a Smarter City has to go through a progression path spread over three waves.

Wave 1 – Foundational Smart City Initiatives

City planners have a wide range of possible initiatives they can consider to make their city “Smart”. While taking them all up at one go could be overwhelming – not just for the planner but also for the average resident – there is a bare-minimum set of services that residents expect from a Smart City. It is prudent that cities evolve by establishing a strong foundation that can be leveraged and extended further with time. Here is a sample list of these Foundational Smart City initiatives:

Smart City Initiatives

Wave 2 – Advanced Smart City Initiatives

With the number of Smart City programs being executed worldwide, there will always be a demand on city planners to ensure that their city stands out from the crowd. Of course, this can only be done once the foundational setup is in place. A unique experience for the Smart City resident is essential to ensure stickiness and brand appeal. These initiatives build on top of the foundational initiatives and further differentiate the “city experience”.

Advanced Smart City Initiatives

Wave 3 – Correlation between Initiatives

Having established the Smart City initiatives, a mature Smart City will have to deliver a “one-city” experience to its residents across all interfaces. This can be achieved by integrating the data from different initiatives into one data hub and generating correlations between various sets of silo-ed data. An interesting example would be to correlate weather data with water consumption levels to draw patterns for hot-day vs cloudy-day scenarios and leverage this further to predict future water usage. Another example would be adapting traffic management based on incidents in the water network (sewage pipe leaks). A representation of this solution is shown here.

Data Hub
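The weather-vs-water example can be made concrete with a small Pearson correlation over invented daily figures; a city data hub would run the same kind of computation across its silo-ed feeds at scale.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented week of readings: daily max temperature (weather feed) and
# citywide metered water consumption (utility feed).
temps_c = [24, 27, 31, 35, 29, 22, 38]
water_megalitres = [410, 430, 470, 520, 450, 400, 560]

r = pearson(temps_c, water_megalitres)
print(round(r, 2))  # strongly positive: hotter days drive higher usage
```

A strong correlation like this is what lets the hub move from reporting to prediction, e.g. provisioning extra water supply ahead of a forecast heat wave.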

What makes a city “Smart” depends on where you are on the evolution journey. For established cities that want to evolve into Smart Cities, there can never be a standardized journey template, since each city has its unique needs, demands and constraints. Greenfield cities, like the ones coming up in emerging markets (Dubai Design District, Palava, Lavasa et al), have the advantage of not being bound by existing systems/infrastructure. They can be innovative and plan their journey so they can extend and scale over time, with the end objective of delivering a differentiated city experience to their residents.