FDIC Bank Data API Available, Official Announcement Pending

The U.S. Federal Deposit Insurance Corp. is realigning its online search tools to an API architecture — meaning developers can start accessing data via open API.
The U.S. Federal Deposit Insurance Corp. (FDIC) appears to be reorienting its data architecture toward an API-enabled approach, though official confirmation is not yet available.
Despite this, a key resource, the FDIC’s BankFind search tool, which lets users confirm whether deposits made at a specific bank are insured, is now accessible via API.
The FDIC is a central government authority that contributes to the robustness and security of the U.S. economy. By insuring deposits made at banks and addressing risks to deposit insurance funds, the agency provides a level of confidence and security in case a particular bank fails. Its online BankFind tool lets consumers and businesses check whether their banks carry FDIC deposit insurance, and therefore whether their own deposits are federally insured if anything happens.

It appears that in recent months, the FDIC has been rewriting the underlying architecture of the BankFind tool to enable data to be queried via API.
As yet, there has been no documentation or official announcement; however, it is already possible to start querying the BankFind tool’s resources, including:
- Searching whether a bank is insured by its name: https://odata.fdic.gov/v1/financial-institution/Bank?$format=json&$inlinecount=allpages&$filter=(substringof(%27BANK%20OF%20AMERICA%27,name))
- Discovering details about the bank, such as its returns on equity, website, quarterly net income and initial date of insurance: https://odata.fdic.gov/v1/financial-institution/Institution?$format=json&$filter=certNumber%20eq%204569
- Identifying bank branch locations: https://odata.fdic.gov/v1/financial-institution/Branch?$format=json&$inlinecount=allpages&$filter=certNumber%20eq%204569
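As a minimal sketch, the first of these queries can be reproduced in a few lines of Python. The endpoint and OData parameters are taken directly from the sample URLs above; since the API is undocumented, field names and behavior may change without notice.

```python
import requests

# Base URL observed in the undocumented BankFind OData service.
BASE = "https://odata.fdic.gov/v1/financial-institution"

def search_banks_by_name(name):
    """Return insured institutions whose name contains the given string."""
    params = {
        "$format": "json",
        "$inlinecount": "allpages",
        # OData substring filter, mirroring the sample URL above.
        "$filter": f"substringof('{name.upper()}', name)",
    }
    response = requests.get(f"{BASE}/Bank", params=params)
    response.raise_for_status()
    return response.json()

print(search_banks_by_name("BANK OF AMERICA"))
```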
While the API’s most obvious use is as a bank locator offering instant details of banks operating in the U.S. and their branch locations, it could also potentially be used:
- To integrate with tools that help consumers select which banks to use
- By financial institutions to draw on the additional supply of banking details stored by the FDIC (such as the net income data)
- Programmatically by businesses to risk-assess banking partners for their projects
The approach shows that while APIs are gaining momentum across government departments, it is not always clear whether new initiatives are drawing on the experience of the government’s thought leaders and available resources. For example, 18F, the "intrapreneurial" office of the General Services Administration, has released a variety of resources to help government agencies document their APIs and alert developer communities to their existence. If such an effort is in train at the FDIC, it is not quite there yet: The agency appears to still be building its API architecture before drawing on government expertise in documenting and promoting an open API. In any case, the API is openly available for any developer to begin tinkering with, even though no official documentation or announcement has been made.
ProgrammableWeb has reached out to the FDIC for comment and will provide an update when the open API is announced.

New Smart City Global Hackathon Encourages Dev API Use
A new international data hackathon event series aims to encourage developers to use APIs to create smart city applications.

The goal of Global Urban Datafest is to bring developers together with local experts, including journalists, community leaders, urban planners and subject-matter experts, to create teams that can solve some of our cities' greatest challenges.
Two winning teams from each city will be selected to go head to head in a global competition at a future Global Urban Datafest event.
The event will be held in more than 25 cities on the weekends of Feb. 21-22 and March 7-8.
The diverse city hosts include:
- Barcelona, Spain
- Boston
- Dakar, Senegal
- Mexico City
- Sao Paulo, Brazil
- Shenzhen, China
- Vancouver, British Columbia
Civic tech is an emerging field aimed at using technology to create solutions that address the challenges of our growing urban environments. Global populations continue to grow, with forecasts predicting that by 2030, 60% of the world’s population will live in urban areas.
Along with this pressure comes increased consumption of all resources, from oil to food and water, and greater strain on infrastructure like roads.
Cities are beginning to prepare for these challenges in similar ways to how businesses are adopting Web infrastructure: They are taking a platform approach and reorienting themselves into composable units that can be shared and orchestrated into new solutions.
Opening data pipelines, adding sensors to resources like waste management and public transport infrastructure, and updating business processes like procurement are all ways that local governments are preparing to work with entrepreneurs and to create more internally tech-savvy environments that can respond to the population challenges that are also happening at a time of global climate change. Of course, governments move much slower than business, so in many cases these changes proceed at a near-glacial pace and are impacted by politics at every level.
The Urban Datafest event series aims to help quicken this rhythm and inspire participation between technologists and urban subject-matter experts so that new collaborations can be formed and new ideas brought to the fore.
A Challenges Approach to Competition
Each city is posting a series of challenges for local teams to address as part of the Urban Datafest hackathons.
In the Boston area (Holyoke), the challenge revolves around how to use the Internet of Things to improve the pedestrian experience, while Boston (Somerville) is looking for new ways to crowdsource data from community stakeholders to create new policy interventions.
According to Anna Calveras, one of the organizers in Barcelona, the Urban Datafest hackathon will dovetail with the iCity Platform Challenge (open to anyone), which seeks apps made using the iCity Platform APIs. Barcelona-specific APIs on the iCity Platform draw in data from a variety of sources, including:
- Sentilo infrastructure sensors for the environment, waste container usage, pedestrian flow, parking and irrigation
- Smart citizen sensors for noise, light and other environmental health measures
- Open 311-type citizen complaint records
In Vancouver, competitors will use the Urban Opus Hub APIs to access real-time sensor data from transport and the environment, along with city open data sources, to view, graph and analyze data when creating smart city solutions. Urban Opus is also encouraging developers to try the Node-RED IoT visual programming tool and will bring IoT devices like Arduinos and Raspberry Pis to the event so that participants can experiment.
But Can Individual City Challenges Solve Smart City Problems at Scale?
While the goals of the Urban Datafest are admirable, a key stumbling block remains: With each city setting its own challenge using different data sources, APIs and device infrastructure, there is a risk that solutions created in one place cannot be adapted for use in another city. Open API standards are essential to making smart city solutions scalable.
This issue may well come into focus during the subsequent round of the competition series when team winners from each city will compete against each other. At that time, the capacity for teams to demonstrate how they can deploy their solutions in other city environments may create the fertile ground needed to solve one of the greatest challenges in civic tech: the capacity for new entrants to quickly move to a platform approach, drawing on open API standards in order to create lasting, efficient solutions that can be deployed globally.
Participants can find out more at the Global Urban Datafest website. For details of the iCity Platform challenge, which runs until Feb. 28, see the iCity website. Across Feb. 21-22 and March 7-8, interested onlookers can keep up with what is happening by scanning the hashtag #SmartCityHack.
XOData Lets Developers Explore OData APIs Visually

XOData is an API exploration and visualization tool that helps developers understand the entities and resources available in an API using the OData standard.
Open data API exploration and visualization tool XOData aims to make it easier for developers to understand the resources, functions, and parameters available for extraction from datasets created in the OData open data format.
Ram Manohar Tiwari, founder of PragmatiQa and creator of the XOData tool, says the tool is suited for both developers and business users looking to explore existing APIs as well as to easily prototype and test APIs created in the OData format.
OData as an Open Data Standard
OData is an international open data protocol standard aimed at encouraging best practices among open data publishers when building RESTful APIs. OData is the protocol recommended by the Organization for the Advancement of Structured Information Standards (OASIS).
Major libraries and publishers of open datasets often ensure that the source data is available in the OData format.
Since May of last year, for example, OpenDataSoft has made all open data hosted on its platform available as read-only OData endpoints.
Additionally, the Organisation for Economic Co-operation and Development (OECD), the U.S. court system, and Socrata all make open data available via OData endpoints.
XOData Core Functionalities
XOData provides a number of options to explore OData datasets, explains Tiwari.
Users can type a metadata endpoint directly into XOData, or they can test the tool with the variety of sample OData sets provided. (In a nod to Microsoft’s legacy as the initial creator of OData, the sample datasets all use the classic “Northwind” database name that some developers will remember from Microsoft Access.)
Visually, XOData can display the relationships between the data in an OData API.

Full entity details are available via XOData’s API details functionality.
These functions help developers see what entities, types and attributes are included in an open dataset and reveal the parameters that can be used for each. This can be useful for open datasets where documentation is minimal, or where OData API endpoints have been exposed on sites like OpenDataSoft for datasets not originally published in OData format.
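For developers curious about what XOData is parsing under the hood, here is a minimal sketch of the equivalent raw OData calls, using the public Northwind sample service hosted at odata.org (an assumption for illustration; any OData endpoint would do). The $metadata document is the same machine-readable description of entities, types and relationships that XOData renders visually.

```python
import requests

# Public OData sample service (assumed available); any OData endpoint works.
SERVICE = "https://services.odata.org/V4/Northwind/Northwind.svc"

# The $metadata document describes entities, types, attributes and
# relationships -- the raw material XOData turns into diagrams.
metadata = requests.get(f"{SERVICE}/$metadata")
print(metadata.text[:400])

# A simple data query: the first five customers, returned as JSON.
customers = requests.get(
    f"{SERVICE}/Customers", params={"$top": "5", "$format": "json"}
).json()
for customer in customers["value"]:
    print(customer["CustomerID"], customer["CompanyName"])
```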
XOData: Visualizing Data from OData Datasets
In addition to allowing developers to explore what is available in an OData API, XOData can also retrieve, analyze, and visualize the data available via the API.
Data can be retrieved through XOData’s Data Explorer query function and shown in an Excel spreadsheet table format. From there, data can be filtered to show only particular ranges or subsets, and can then be displayed visually as graphs and maps. Charts can then be exported as images.


XOData is available for free online access for developers and is one of the recommended tools of the OData oversight organization. A Chrome web app is also available for a fee.
Government Agencies Turn to IFTTT to Make APIs Accessible

IFTTT has released terms of use specifically so that U.S. government agencies can create API services on its platform.
The release of a government-focused terms of use (TOU) aims to enable government agencies to make APIs available as "channels" on the IFTTT platform.
IFTTT (which is rebranding its main service as IF) is an API aggregation service that lets users create simple workflows triggered by specific events. For example, users can enable a "recipe" so that whenever a user saves a Web article or blog post to the read-it-later app Pocket, the URL of the article is automatically saved to a Google Drive spreadsheet. Like any software-as-a-service product, IFTTT has terms of use covering things like privacy of a user’s account data and noncommercial clauses to prevent developers from using IFTTT as a feature within their apps without a partnership arrangement with IFTTT.
The Importance of Government-Specific Terms of Use for SaaS and API Providers
The new terms of use mean that U.S. government agencies can create their own channels that demonstrate to end users and developers what can be achieved using government APIs and open data. The process is instructive for any API or SaaS provider that hopes to work with any level of government in the future.
The main focus of the government terms of use appears to address national regulations for the storing of data by third parties. Creating terms of use specifically for government usage of social media and cloud services has become essential to addressing regulations governing security, privacy and records retention, and to match some specific agency regulatory frameworks. In the U.S., this process has been standardized for all tiers of government: federal, state and local. (A detailed explanation of the process has been published by the DigitalGov website.)
Agencies Using IFTTT to Dogfood Government APIs
Justin Herman, lead architect of the IFTTT TOU, works at the federal General Services Administration (GSA) on government social media strategies. He is hopeful that government agencies can use IFTTT channels to create internal efficiencies, in the same way that agencies should be dogfooding their external APIs for internal use. “The first step in making an IFTTT channel useful is to show the business case for either improving public services or reducing the current costs of delivery,” he says.
“For example, every single government program that uses social media can save money because we have records management laws that require us to archive all social media records for the general public," Herman says. "Before this IFTTT option, there were limited options for recalling and exporting social media accounts data. Now with IFTTT, we are easily able to set up a Google Calendar that auto-archives every single post as it is done.”
Herman says that this is a significant savings: Some agencies may have 20 suboffices around the country and dozens of subagencies, all with their own social media accounts.
Government Agencies Ask Developers What API Recipes They Want
With the TOU in place, agencies are now seeking the help of developers in understanding what APIs and workflows would be best set up as IFTTT channels.
“There have been government agencies looking at how to use APIs to better deliver public services for years now,” explains Herman. “In the GSA, we have senior API specialists who help other agencies. So there is a robust community across the federal government keenly interested in revolutionizing digital public services. Now, it is a matter of looking at what is the best use of this system.
“Right now, what we are looking at is the hundreds of things we could be doing with IFTTT, both for public services and for how we can use it internally.”
To help promote the use of IFTTT among government agencies, federal intrapreneurial GSA branch 18F has created If Gov Then That.

“This is a multistage process," says Herman. "Agencies are very interested in taking a look at any opportunity to find out what the developer community needs. That is going to trump anything.
“For this initial release, we are asking what are some ways that IFTTT will be most useful for people and really bring out the potential of this,” says Herman. He compares the possibility of using IFTTT channels as similar to the innovation leverage from using weather APIs: “Weather data is government data that is used for a myriad of services. That’s the model we want for services delivered through APIs. People could be using and benefiting from government data without thinking, ‘I am downloading a government app.’ ”
As with IFTTT’s main TOU, use of the service to power commercial software applications is discouraged. However, developers could use IFTTT to prototype a proof of concept when considering adding a similar alert or automated feature to their products. Trying IFTTT first may help developers understand the end-user value of a feature before digging into the API directly to build that functionality into their products.
The move has already led one of the U.S.’ leading developer advocates of open data, Waldo Jaquith, to tweet:
Open gov't data needs @ifttt channels.
— Waldo Jaquith (@waldojaquith) January 21, 2014
20 Ways Developers Could Use IFTTT Recipes to Trigger Alerts
For urban mobility and travel app developers:
- When severe weather alerts are issued
- When Transportation Security Administration lines at a given airport are longer than 20 minutes
- When commuter bridge inspections are overdue
- When a national park has been closed or when walking trails are temporarily closed
For apps targeting journalists, editors, media agencies and industry analysts:
- When a new data source has been added to data.gov
- When a response to a Freedom of Information Act request has been published
- When a We the People petition has been started or has reached the threshold for requiring a government response
- When new Bureau of Labor Statistics data on forecasts for particular professions are issued
- If census data for a particular data set falls above or below a particular economic indicator threshold
- If a particular lobbyist visits the White House
- If a government agency has created a new social media account
- When a service contracted by government has crossed a threshold for cost or time overrun
For food-related apps:
- When a new farmers market is added
- When a particular type of food or product has been recalled
- If a restaurant inspection declares a venue unsanitary, remove it from a recommendations engine
For financial apps:
- When a tax return has been accepted or a refund has been issued
- When an FDIC-approved bank opens a new location
For medical apps:
- When a hospital service has changed location or hours of operation
- When a medication has been recalled
- When a medical therapeutic device has been recalled or had usage warnings changed
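To make the recipe idea concrete, here is a minimal sketch of the trigger half of a food-recall alert like the one listed above, written against openFDA's public food enforcement endpoint (an assumption for illustration; this is not one of the IFTTT channels discussed here). A real IFTTT channel would hide this polling behind a trigger, with the "then that" action chosen by the end user.

```python
import time
import requests

# openFDA's public food enforcement feed (assumed; no key needed for light use).
FEED = "https://api.fda.gov/food/enforcement.json"

def latest_recalls(limit=10):
    """Fetch the most recent recall records from the feed."""
    response = requests.get(FEED, params={"limit": limit})
    response.raise_for_status()
    return response.json()["results"]

seen = set()
while True:
    for recall in latest_recalls():
        if recall["recall_number"] not in seen:
            seen.add(recall["recall_number"])
            # The "then that" half of the recipe would fire here:
            # an email, a push notification, a row in a spreadsheet.
            print("RECALL:", recall["product_description"][:80])
    time.sleep(3600)  # poll hourly
```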
Developers who want to see government APIs enabled on the IFTTT platform are encouraged to share their thoughts. “We want to know people’s feedback on it. What are the types of things you want to see and what do you need to see? What do developers need to make use of Government APIs on IFTTT?” Herman asks.
He says the best way for developers to provide feedback on what APIs they want to have turned into triggered workflows (or recipes) on IFTTT is to add their thoughts to the open thread on GitHub set up by If Gov Then That.
New Zealand Government Launches API Portal for Businesses

The launch of a new API portal shows how governments around the world are focusing on government API discoverability.
The New Zealand government has launched an API portal to encourage businesses to integrate with government services. New Zealand has one of the more advanced government digital strategies in the world, and the initiative is a strong, early industry example of how governments can communicate the value of APIs to industry.
The launch highlights the emerging challenge for governments to ensure that their APIs are discoverable and usable by businesses and adds to the various models used by governments to enable API discoverability.

The New Zealand government website, Better APIs for Business, is part of a national strategy to make it easier for businesses to grow without government bureaucracy or other barriers slowing them down. It also aligns well with the government’s progressive open government agenda. The agenda includes a cloud-first governmentwide policy and the creation of two digital channels to enable New Zealand citizens to access government services securely online.
The Better APIs for Business portal, hosted by the New Zealand government’s Ministry of Business, Innovation and Employment, includes neat introductions explaining what APIs are and case studies that reference the U.S. and U.K. governments’ digital strategies. Local examples are also shared, including a New Zealand API for digital cultural materials, a case study on how APIs help manage transport borders across the country, and an overview of available APIs to access New Zealand’s geospatial data.
A Government API Marketplace
This week, one of the API team’s architects, Glen Thurston, has called for businesses to share their thoughts on how the government should build an API marketplace that can help businesses discover what APIs are available.
Thurston is now focused on transactional APIs that not only expose data but also integrate with businesses so that information can flow back and forth to carry out a government service. To move the discussion of an API marketplace forward, Thurston asks businesses:
- What would you need to know about a transactional service or dataset to be able to tell if it’s of interest to you? This could be both from a technical or business opportunity perspective.
- In the absence of actual APIs, should we start by listing the type of services agencies currently deliver? This could give an indication of the types of services and data that agencies hold that could be useful for future API development.
Current Challenges in API Discoverability
Two challenges for government API discoverability facing the NZ initiative are the need to overcome departmental silos when communicating the value of APIs and the difficulties in creating a library database that is accessible as entries grow.
Overcoming Department Silos
In initial consultations hosted by the New Zealand government’s API Leadership Group, a priority request was for a single point of access that describes what APIs are available and which ones will be coming in the future. For businesses, APIs are seen as “the government,” not a particular government department, and they do not want to have to navigate through the inner workings of government to find the appropriate agency that may have an API they can use.
Weather APIs are a good example of how this useful data service could be hidden by a government’s API library design. As Justin Herman from the U.S. General Services Administration recently pointed out, most people are unaware that the weather data fueling many apps is from the U.S. government. It is managed by the National Oceanic and Atmospheric Administration. Should a business need to know that in order to find the API documentation? If you wanted to use this API for an emergency-services, first-response-type app, would you be more likely to search for the API on the Federal Emergency Management Agency’s website? If you were adding climate data to an agriculture solution, would you be more likely to look for the API on the Agriculture Department’s website?
A Searchable API Library
At present, the site lists only four API sources in its directory, although two of these point to much larger libraries of possible APIs. This is less comprehensive than ProgrammableWeb’s list of NZ government APIs. Missing from the Better APIs for Business website are some APIs with clear potential for integration by business, including New Zealand Tourism’s API; New Zealand Post’s multiple APIs, including the Post Locator API; and the Charities Commission API of registered tax-deductible charities.
Unfortunately, the data.govt.nz resource listed in the API directory leads business users to an open data library that demonstrates the difficulties businesses have in finding APIs.

Filtering for data sets that have an API format available reveals 1,953 results, but the site does not let users search within those results by category or terms-of-use license, leaving businesses to wade through pages of listings like “Cook Islands Airport Polygons” topographical data sets to try to surface something that suits their potential use case.
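CKAN, the software behind data.govt.nz, does expose this kind of filtering programmatically even where a site’s own interface does not. A hedged sketch, assuming CKAN’s standard action API paths are enabled on the site and that “API” appears as a resource format facet:

```python
import requests

# CKAN's standard action API (path and facet names assumed for data.govt.nz).
SEARCH = "https://data.govt.nz/api/3/action/package_search"

response = requests.get(
    SEARCH,
    params={
        "q": "transport",        # free-text search within the catalog
        "fq": "res_format:API",  # facet filter: datasets exposing an API
        "rows": 5,
    },
)
response.raise_for_status()
for dataset in response.json()["result"]["results"]:
    print(dataset["title"])
```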
At its heart, this is the problem Better APIs for Business hopes to solve.
Initial Industry Reaction
Geoff Leyland, director at planning consultancy Incremental, has been using New Zealand government open data for some time now and is impressed by the way government agencies have made data accessible. Key projects he has worked on include using government data to help large freighters identify transport options, to help waste companies calculate contract pricing and to help local product cooperatives optimize route planning.
For Leyland, the hope is that APIs will mean more formats for end users, and more formats means more accessibility. “I think that it’s important that a range of levels of access are freely available so that people can choose whether they want to get their hands dirty or, if they don’t, that there are good tools that get them quickly what they want,” he says.
To date, Leyland has primarily used bulk downloads of data rather than APIs but can see how he could “use an API to get incremental updates to the data, rather than just doing bulk downloads.”
He is optimistic that all government agencies will implement best practices in making APIs accessible, as he has seen firsthand the responsiveness of Land Information New Zealand’s (LINZ) approach to publishing open data: “All of my experience has been with data.linz.govt.nz, and really, it’s awesome and it keeps getting better. I’ve talked to LINZ about it, and I just said, 'Do it more.' They’ve made improvements, like having permanent identifiers for addresses (which I haven’t used yet). I guess what I really want is for more data sets to be open.”
Leyland is hopeful that the government’s focus on APIs will help spark new ideas about products and services he can offer his clients. “In the future, I actually expect to be offering APIs based on ‘useful’ views of bulk data," he says. "One of my current toys is a tool that sends alerts when the electricity price in a region crosses a user-set threshold. Currently, price data isn’t available in a particularly nice format, and then once you have it, you have to deal with the bulk data, so it’s a question of offering an API that provides a useful function.”
An Initiative for Other Governments to Watch
The move seeks to overcome one of the biggest challenges that has faced open data in recent years: Many governments felt burned after publishing open data that no one used. The API agenda goes beyond just opening up data sets to better focus on discoverability, business use cases and including APIs that provide programmatic access to government services (transactional APIs). As one of the forerunners in allocating resources to a business team to promote government APIs, the New Zealand government will be closely watched by other state and national governments around the world.
The Better APIs for Business website also hosts a Twitter account and GitHub repo to encourage engagement.
U.S. Energy Information Administration Enhances API Accessibility

The U.S. Energy Information Administration has adopted cutting-edge API uptake strategies to bring its API tools to a wider audience.
The U.S. Energy Information Administration (EIA) has partnered with the U.S. Federal Reserve to make up-to-date energy and economics data automatically available in spreadsheets via their APIs.
This week, the federal agency — which is responsible for independent energy information such as energy production, stocks, demand, imports, exports, and prices data — released a data add-on tool for Microsoft Excel that uses macros to automatically channel live data from EIA’s APIs into analytics spreadsheets.

“Analysts live in Excel, so it was a way to make the API useful to them,” Mark Elbert, director of the Office of Web Management at the EIA, told ProgrammableWeb.
“It doesn’t take much to build a use case for providing this type of tool, where analysts can access the most recent data for performing periodic analyses of energy and economic data,” Elbert says. He explains this is the nuts and bolts work of many analysts, and embedding the API in a tool was a way to make it more accessible to a wider audience. “In fact, our administrator used to do as much when he worked for a leading international bank.”
The add-on allows researchers and analysts to browse categories, search for keywords, and download specific time series and current energy data from the Energy Information Administration’s API, as well as financial data from the Federal Reserve Bank of St. Louis’ Economic Data API. The data is imported directly into the spreadsheet, which is then kept up to date as new data is made available.
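As an illustrative sketch, the two kinds of API calls the add-on automates look roughly like the following; the series IDs are examples, both services require free registration keys, and the endpoints shown are the publicly documented ones rather than anything specific to the add-on.

```python
import requests

# Registration keys are free but required by both services.
EIA_KEY = "YOUR_EIA_KEY"
FRED_KEY = "YOUR_FRED_KEY"

# An EIA time series (example ID: weekly U.S. crude oil stocks).
eia_series = requests.get(
    "https://api.eia.gov/series/",
    params={"api_key": EIA_KEY, "series_id": "PET.WCRSTUS1.W"},
).json()

# A series from the St. Louis Fed's FRED API (example ID: WTI spot price).
fred_series = requests.get(
    "https://api.stlouisfed.org/fred/series/observations",
    params={"api_key": FRED_KEY, "series_id": "DCOILWTICO", "file_type": "json"},
).json()

print(eia_series)
print(fred_series)
```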
For now, the tool is available only as a Microsoft Excel add-on for Windows PC users. “There are two ways you can install the add-on: You can run it as a session of Excel. That means you need to install and run it each time you open Excel. Or you can add it as an add-in, so you can permanently run it in Excel as a macro.”
How APIs are Facilitating a Culture of Innovation Within Government
The release of the tool is clearly marked as beta, and a feedback tab is provided so that users can share their experiences. While feedback comments are moderated, the idea is to encourage development in the open. Elbert confirms the agency wants to be frank about any bugs identified in the tool.
The idea of offering a beta tool like this symbolizes a new entrepreneurial culture that is evident in government agencies. As APIs drive new innovation, a willingness to fail and fail fast in the open is emerging.
Elbert says that while there are no specific policy guidelines around creating beta tools within government agencies, there is a rough dividing line between what sort of strategies are suited to innovation and public beta-testing, and what areas of the EIA’s work require greater risk management.
Elbert continued, “While we might be innovative in terms of how data is displayed or add-on tools, we wouldn’t approach our statistical programs in the same way. That requires a lot of rigor. Our statisticians are very proud of the numbers we release, and we have a strong culture of making sure those numbers are solid. But for the transmission and dissemination of data, we can be more innovative and keep up with a very fast pace of technology.”
A Cutting Edge API Uptake Strategy
While the inspiration for the tool came from the Federal Reserve — which shared the code from its own Excel add-on to make the EIA tool possible — the approach models some of the most cutting-edge API adoption strategies currently being trialed by private businesses.
Online behavior analytics company SimilarWeb, for example, has released a number of Google Docs tools that have built-in calls to its APIs. Users can download the templates, register for an API key, and then customize spreadsheets and search tools directly from Google Docs. The technique has been driving API uptake and subscriptions since being introduced in July 2014.
The EIA’s approach follows the same strategy: Make the power of the API accessible as a ready-made tool that end users can then customize to create their own value from the data the API makes available.
Use of the EIA and Federal Reserve’s APIs is free of charge, as are the add-on tools available for download.
Open Standards for Civic Tech APIs Edge Closer to Reality

Thirty-one cities agree to API standards in a new initiative between the FIWARE open API platform and the Connected Smart Cities Network.
The EU-funded open API platform FIWARE has reached an agreement with seven countries to embed its core infrastructure as open API standards for creating new civic tech solutions. The initiative hopes to foster the growth of a new wave of startups focused on smart city technologies.
A new Open & Agile Smart Cities (OASC) initiative has been signed by 31 cities across seven countries — Belgium, Brazil, Denmark, Finland, Italy, Portugal and Spain — agreeing to four key mechanisms for the development of smart city infrastructure. The initiative is a partnership between the EU-funded FIWARE and the Connected Smart Cities Network.

“Three of the mechanisms are technical and the fourth is the agreed approach to take,” said Martin Brynskov, chair of the Connected Smart Cities Network. “A letter of intent has been signed by the 31 cities that includes a commitment to an approach that is driven by implementation. This initiative is about doing something; ideas will evolve through implementation, and we will see what is a good idea and what is a bad idea.”
In addition to the implementation-driven approach, the three technical mechanisms agreed to by participating cities are:
- To support the deployment of FIWARE’s NGSI API open standard, which proposes a common data model for getting real-time, contextual data about cities
- To share API data models, starting with the CitySDK APIs
- To use the open source platform CKAN to publish open data
The NGSI API
Juanjo Hierro, coordinator and chief architect of the FIWARE platform, says the agreement to use the NGSI API as the open standard for how cities will provide access to real-time, contextual information is key to the smart civic tech innovation that is being fostered by this initiative.
Hierro explains:
If we talk about a smart city, when we are trying to create smart applications, context means anything that describes what is going on and its current state. For example, the buses, shops, streets and even citizens are all entities with attributes that have values. So a bus is an entity that has location attributes that change during the day, also information about its license plate, its next stop, its route, passenger carrying capacity, the driver assigned to the bus for that day, etc.
So context information is about all of those entities that describe the city and what is going on, and the values that characterize those entities that change over time.
The hope is that if all cities are using the same NGSI data model, they can then open up data they may be collecting from sensors and other civic technologies and make it available in a uniform way for third-party application developers. This creates a scalable model so that entrepreneurs and startups can create a civic tech application that can work across any city signing up to use the NGSI standard.
"NGSI allows access to contextual information about a city at a near-real-time response rate," Hierro says. "So applications may query the current status and location of this information. The idea that this standard API will work in every city is what will allow developers to create an application once and see it used in several cities."
Building an Open Standard Data Model in Partnership with Developers
Under the OASC initiative’s implementation-driven approach, the idea is that the API data model will be co-designed with application developers who are building the next wave of civic technology solutions.
Hierro gives an example of an application that is rating the accessibility of public buildings in a given area. If this data is not already held by a city, the application can crowdsource ratings from end users to score how accessible a building is for those with limited mobility or special needs. The application could then share this data with the city so that it is added as a contextual attribute field in the city’s overall data model:
You do not have to think about entities that may be defined by the city itself; the nice thing about NGSI is the ability to extend the model to add new entities and to add new attributes to existing entities.
In this example, the city may provide info about buildings in the city, which allows the application developer to rely on the model, but also to add a new attribute to each building. So through the NGSI standard, cities are providing a platform that enables developers to rely on information in the first place, and then for those developers to enrich that information, which in turn other applications could use.
We have to think about context as something that is not static or rigid. It is really about providing a platform that allows third parties to enrich the data.
The Implementation-Driven Approach to Setting Standards
With an implementation-driven approach, Hierro is confident that the development of open standards will be more meaningful and will resonate with third-party developers.
The next stage of the work of the OASC will be to identify civic tech solutions being built by startups and small and medium enterprises. These solutions will identify what entity and attribute definitions will need to be added to the NGSI data model.
Hierro explains:
So the next step is the OASC cities will meet with SMEs and startups that were selected through the FIWARE Acceleration program, learn about the applications and try and pick up those that are of interest to the cities and initiate a process where a first set of data models can be adopted based on what these applications require. This is important because once those standard data models are adopted by the city, cities can then offer their citizens the applications that can solve those problems.
Hierro and Brynskov see the process as a back-and-forth flow. In some cases cities may already have some data models in place — for example, how entities for bike rental schemes, waste management collection, public transport infrastructure and cultural landmarks are defined. But the hope is that the priority will be on how applications are identifying the data needs they have in order to create solutions that solve real city problems.
“It is a problem-solving rather than a committee-driven approach,” confirms Hierro. “The cities that are part of this initiative believe that we need to start by looking for applications that solve real problems, and we wish to offer them the data they need to solve the problems they have identified. This will be the first basis for the data models, and they become useful for other applications. We don't want to create standard data models if we don't have a view of what applications can support them.”
To avoid duplication and promote scalable city tech solutions, the OASC will then regularly publish data models that document the standards that are beginning to emerge among a dynamic and engaged civic tech entrepreneurial sector.
While the original NGSI API uses an XML schema, the FIWARE Open API platform provides binding for a RESTful interface that allows access to the data via XML and JSON. Hierro notes that JSON is the most widely used among developers participating in FIWARE’s various accelerator programs.
CitySDK Model Ready to Scale
As an example of a problem-driven approach to designing data models, the OASC has already agreed on the first example, drawing on the recent work of the CitySDK project to identify a common set of API standards. CitySDK has been working across a number of cities to test a common API standard for building mobility, tourism and civic participation apps. The data models this project creates will be more widely disseminated across the 31 participating cities. The idea is that cities will create adaptors between their current data sets using the CitySDK data models so that an NGSI API can access equivalent data on entities in each city, thus rapidly scaling up the potential for startups to create a viable market for their products.
“These data models are already the most mature," says Brynskov. "Transportation in particular has established de facto data models. That will happen for the other areas. Mobility is the frontrunner for shared data models, and CitySDK has some of the primary fields defined for sharing this data, so what we are doing is trying to make it easy for cities to get experience."
CKAN as a Common Open Data Publishing Platform
The final component of the common infrastructure is a commitment to use the open source CKAN project as the foundational repository for publishing open data for each city. While CKAN has traditionally focused on publishing historical data, the governing body Open Knowledge Foundation is looking at ways to publish NGSI data out of the box.
And while CKAN is an open source solution, Brynskov doesn’t see that this will compete directly with the emerging startups providing open data publishing platforms for cities. Startups like the Paris-based OpenDataSoft and Socrata in the U.S. are trying to build viable businesses that provide a commercial data publishing platform for cities. Brynskov doesn’t see the agreed adoption of CKAN as a deterrent impacting those startups, believing that if CKAN is used as the central data store, then for players like OpenDataSoft it becomes about providing a strong value proposition in the interface that they provide to their customers:
CKAN is a platform; you need federation. If the federation principles are open, there is no reason why other platforms can’t link up. They just have their own ways to link up repositories of data. Technology is a no-brainer here. How do we make it easy and appropriate from a city perspective to share data?
If you want to play with cities, you have to support open standards. The DataTank, for example, runs on top of CKAN. This is a decision to support that open standard.
Civic Tech to Reduce Inequity and Spur Innovation
Brynskov and Hierro both believe that this new agreement will enable cities to address social inequities faced by marginalized citizens and poorer communities, as well as enable the next wave of innovation and economic development. Brynskov explains:
This is important if you do not have a lot of resources as an individual, community or city. Any city on any level can set this up as they need to. They can take a ton of what other cities have done and tweak it for their local situation. It will not be high cost. You will need to tweak the application solutions to your city level, but that is scalable.
This technology will also help close equity gaps quite cheaply. There are small minorities in all cities who have a particular perspective or need. With these standards you can leverage resources across the globe and really cater to individual needs. It really is a beautiful characteristic of this. If you look at how the global population is evolving, growth will mostly happen in Africa, South America and Asia and will be unstructured, so we really need to have mechanisms that can scale and fit diverse environments. Looking ahead, this is a crucial component.
Hierro sees the initiative as “a significant step forward” for stimulating economic opportunity by creating a new wave of entrepreneurial activity:
The whole notion of a standard that developers can rely on for context of a city will really enable a lot of applications that developers can market. It is a real boost for innovation. This is not just about more efficient services, but about transforming the city into a platform for the development of applications. This notion of offering data and enriching context information will be a central enabler.
A second round of cities is expected to join the initiative in June.
Owler Looks at How APIs Fit into a DaaS Business Model

Jim Fowler, CEO of the new Owler, explains where APIs fit into the company’s global vision and why data-as-a-service business models need to be more than data aggregators.
Serial entrepreneur Jim Fowler is seeking to replicate the success he created with Jigsaw — which was eventually acquired by Salesforce.com for $142 million in 2010 — with his new venture, Owler. The new business wants to collect global company information in a user-friendly format and sell it back to enterprise customers who usually have to invest in the procurement of company data privately.

“Our business has two sides,” explains Fowler. “Owler is completely free. We keep you up to date on your competitor set and can send you competitor reports with crowdsourced data and CEO ratings. Our users give us this data because we give them valuable data in return, but we also sell it at an enterprise level, which is how Jigsaw did it.
“We have a set of APIs out now that are creating feeds out of our data. But our vision is for us to become the data source that everyone comes to. We are in the pretty early stages of where we are going, so at the moment it is really about building more information; we don’t want to spend too much time on the APIs to move that data.”
The service relies on crowdsourcing as the primary data collection mechanism; uses the company’s URL as the base company identifier; and draws on RSS feeds from the U.S. Securities and Exchange Commission and similar national government entities around the world (“the Brits are particularly good; they even give revenue data as well,” says Fowler).
Governments Failing to Make Company Data Open via API
Owler’s data collection approach via RSS reveals how far open data APIs still have to come in order to support new business initiatives.
For example, the U.S. Securities and Exchange Commission (SEC), which is the government collector of all publicly listed company information, does not have an API to allow access to that data. Companies must file their annual financial reports with the SEC, and those filings include a heap of additional business information that is useful to mine, such as subsidiary companies, office locations, relationships with parent companies and industry classification codes. But while the SEC provides this information publicly, it does so in eXtensible Business Reporting Language (XBRL), which is not designed for automated use or integration into external databases in the way that APIs enable. So pulling SEC data into user-friendly formats ends up being done by proprietary data providers like EDGAR Online, which funnel SEC data into their financial and business data platform and then provide that data back out as the EDGAR Online API.
Like EDGAR Online, the nonprofit, open-to-donations CorpWatch scrapes published SEC company filings and provides an API free of charge. Rank and Filed and Last10K also provide APIs to better access SEC company data; Rank and Filed was created by a former SEC employee who was so frustrated by the lack of APIs that he spent seven months learning how to code and now makes the API resource available.
While Owler would prefer to use an API if one were available, it instead uses the SEC’s RSS feeds to draw in the information that populates its basic profiles. It then takes a crowdsourced approach, encouraging Owler users to add to the business information datasets, answer questions about CEO approval ratings and share their acquisition predictions.
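A minimal sketch of the feed-based route Owler describes, assuming EDGAR's Atom output from its company browse interface (the ticker and filing type are illustrative):

```python
import xml.etree.ElementTree as ET
import requests

# EDGAR's company browse interface can return filings as an Atom feed.
response = requests.get(
    "https://www.sec.gov/cgi-bin/browse-edgar",
    params={
        "action": "getcompany",
        "CIK": "AAPL",      # EDGAR accepts a ticker here; illustrative
        "type": "10-K",     # annual reports
        "output": "atom",
    },
    headers={"User-Agent": "research-script contact@example.com"},
)
response.raise_for_status()
root = ET.fromstring(response.content)
ns = {"atom": "http://www.w3.org/2005/Atom"}
for entry in root.findall("atom:entry", ns)[:5]:
    print(entry.find("atom:title", ns).text)
```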
The Data-as-a-Service Business Model
Owler’s entry into the market may create stronger competition among several suppliers that already provide business profile information via an API pipeline, including Dun & Bradstreet, EDGAR Online and CrunchBase. “We want businesses to get out of needing to produce business information data themselves,” confirms Fowler.
According to Himani Jain’s data-as-a-service business model typology published by Harvard Business School, Owler is a data fabric layer, where “players will act as the custodian and aggregator of the data providing controlled access to that data through an API. Part of this offering would be stream processing, where data is analyzed immediately after it has been created.”

ABOVE: Data-as-a-Service business model layers, by Himani Jain (Source: Identifying Value Layer in the Big Data as a Service (DAAS) Business Model)
The Owler business model demonstrates some of the nuances around collating and supplying data that may seem counterintuitive. Having tested his theories with Jigsaw, Fowler firmly believes that “the data business highway is littered with the roadkill of data aggregators. You make money by making your own proprietary data supply.”
Fowler’s take is that while government open data APIs for basic company information are extremely useful for his business, he is not looking to aggregate data supplied from other proprietary sources, which may charge for the data and place caveats on the rights to commercialize or on-sell that data to others.
At present, Owler uses social media APIs to collate data on companies’ follower counts, but it sees the API play as a later stage, when it is more focused on selling its data to enterprise customers who will need a way to integrate it directly into their business systems.
EIA Discusses Managing Open Innovation

The U.S. Energy Information Administration clarifies its API road map following concerns that new tools are not truly open or transparent.
Concerns following the release of the U.S. Energy Information Administration’s latest API-enabled tool draw attention to the problems government agencies will increasingly face when creating in the open. How the EIA is dealing with the valid criticisms demonstrates the types of engagement that governments at all levels will need to cultivate.
Last week, ProgrammableWeb ran an article on the EIA’s new Excel tool, powered by its API. The tool also enables economics data to be automatically fed into it from the Federal Reserve Bank of St. Louis’ API.
Creating API Tools for Proprietary Products
While many were supportive of the new release, it raised some eyebrows because it prioritizes embedding the government-provided API into a commercially proprietary tool — the add-on is suited only for Microsoft Excel users working on Windows operating systems — while not releasing the tool’s underlying source code. From a transparency perspective, this makes it difficult for end users and open data advocates to verify that the tool uses the same API endpoints that are available through the public API.
The approach contrasts with the U.S. Department of Labor, for example. It has apps for Department of Labor statistics that use Labor’s API to update the latest figures for the consumer price index, unemployment rate, productivity indexes and other labor indicators. The department’s API developer portal then has a link to the open source code for the stats app published on GitHub.
Mark Elbert, director of the Office of Web Management at EIA, confirms that the tool is completely built off the API, but that because the source code includes API registration data, it needs to remain private:
The reason it is not open source is not so much the data, it is the methodology, as the tool requires a registration key for APIs. We need to ensure that we don’t have accidentally abusive usage. For example, it would be very easy to be very aggressive with your API calls, say, a million calls a day from a single user. So we have a single point of registration, so if there are load problems, we can speak directly with the API consumer. So there are API registration keys embedded in the file, and that is the primary reason the source code is encrypted.
Elbert confirms: “Yes, the Excel tool is nothing but a wrapper for the public API.” To assuage concerns, EIA will update its download site to “make that very explicit,” he says.
Managing Bulk Usage of a Government API
How government agencies handle bulk downloads of data to avoid overloading API calls will continue to be a key issue, especially for departments like EIA that are releasing a lot of data. The New York City Metropolitan Transportation Authority (MTA) has a similar problem with its subway data feeds. Developers registering for an API key must agree to host MTA’s data feed on their own servers and have any applications that access the data draw on those server copies rather than overload MTA’s data infrastructure. For agencies like EIA, scaling API usage may not be a problem for now and can be managed by reviewing logs of registered users’ API behavior and reaching out to individual users.
Elbert also acknowledges the concern that the agency has created a tool that embeds the API in an add-on for a Microsoft (commercial) product, but argues that that is where the majority of end users are located. Elbert repeated the comment he made in his original interview with ProgrammableWeb, that the bulk of analysts for which the tool is intended live in Excel spreadsheets every day. As such, “the reception we have had from analysts is almost ecstatic. A lot of these users don’t have a programming team, and in the past they have been taking our data and compiling time series into their own spreadsheets. This automates that process for them.”
Elbert is also hopeful the tool will be useful for other government departments and will reduce government administrative costs. The Office of Energy Efficiency and Renewable Energy uses the EIA’s API for forecasting and visualizations, while the Congressional Research Service is one of the biggest data requesters that could possibly benefit from immediate access to the latest data and time series via the API and spreadsheets.
Based on ProgrammableWeb’s article, Elbert confirms the EIA is now also looking into creating a free Google Sheets version of the tool, and he says the JavaScript code for that version will be made open source when it is created. "We are hoping to do a version for Google Sheets and that is very exciting. That is much more up our alley. We are a Web shop, so that will really play to our core strengths. It will be open source JavaScript. The Google Docs will be more transparent," Elbert says.
Also on the agenda are new EIA widgets powered by the API, and as these are created, the underlying source code will be made freely available. EIA has also started consuming its own APIs, which Elbert sees as being a cornerstone to ensuring high usability of the new data products.
External Developer User Experiences
External developers using EIA’s APIs are reportedly fairly happy with the agency’s current API road map, but they face problems similar to those evident with the Excel tool and have also seen some time lags with the API.
Matthew Brigida, associate professor of finance at Clarion University of Pennsylvania, has created an R wrapper for the EIA API and uses it regularly in his classwork with students, predominantly so they can write their energy economic analyses in a reproducible manner, without storing data locally but instead “pulling up-to-date data through the API.”
Brigida encounters the same issue with open-sourcing code that the EIA does:
The API is easy to use — and it supports both XML and JSON. The biggest hindrance is the requirement that the user has an API key from the EIA. The keys are free; however, since they must be kept somewhat secret, it makes it a bit of a hassle to post interactive analyses to the Web. You have to have the code source a private file.
He also notes that “the data sets accessible through the API are stale — they are sometimes a week behind the data posted on their website” but adds, “I understand the EIA is working on fixing this.”
Working in the Open
Adopting entrepreneurial approaches like releasing minimum viable products and working in the open are often counterintuitive to government departments.
Kristin Lyng, who has spearheaded the opening of weather data at MET Norway — now one of the world’s biggest providers of weather data — has talked about the culture of starting to open government data. This often starts with government stakeholders and bureaucrats being uncomfortable with criticism, getting defensive and “even more reluctant to continue on the open data journey.” Lyng advocated among her government peers that often the criticisms were valid and that the comments could actually help government create a better data supply “if they are heard as part of a dialogue with end users.”
Elbert’s management approach at EIA seems to follow a similar cultural attitude, one that API evangelist Kin Lane calls part of the API journey. Lane says that “while we may never get our strategy 100% perfect, we can communicate, and evolve along the way.”
Engaging with end users and developing products that help communities, businesses and startups make use of APIs is still a fairly novel concept for many government departments. Seeing how the EIA accepts criticism, adds to its road map and continues working in the open may give other government departments greater confidence to follow its lead.
New York State's New Open Data API Increases Government Transparency

New York State is introducing an API that will allow developers to build apps using New York’s Open Government Data for analysis, information, and investigations.
This article is a company-provided press release and ProgrammableWeb cannot vouch for the accuracy of the statements within. If you have questions regarding the information below, please contact the company that issued the press release.
Attorney General Eric T. Schneiderman announced today that his office will be introducing an application programming interface (API) that will allow application developers to access and use data from its NYOpenGovernment.com database.
NYOpenGovernment.com is an effort by the Attorney General’s office to promote the public’s right to know and to monitor governmental decision-making. It is the only statewide resource that aggregates a range of sources of state government information – including data on campaign finance, lobbying, charities, state contracts, member items, corporate registrations, elected officials, and legislation – that is otherwise scattered or difficult to retrieve. The NY Open Government API will give developers easier access to this data, which they can use in the creation of applications. Currently, the database is used by good government groups, reporters, and law enforcement agencies for analysis, general information, and even investigations.
“Giving the public direct access to this data will help shine a much-needed light on our state government,” Attorney General Schneiderman said. “In an effort to make government more transparent and responsive, we are providing the public access to these tools. It’s long past time we brought transparency into the 21st century, and I look forward to seeing what analysis and applications are developed from the data."
Today’s announcement was made during Sunshine Week, a national initiative to promote a dialogue about the importance of open government and freedom of information. The introduction of an API for NY’s Open Government website will begin with testing by students at NYU’s Center for Urban Science and Progress (CUSP) program. Eventually, the API will be open to the public and all interested developers.
An API is a set of programming instructions and standards for accessing a web-based software application or tool. APIs allow app developers to query databases and build applications that rely on that data.
Currently, the NY Open Government databases can only be accessed through a simple search bar. The API will allow developers to build new graphical interfaces, devise algorithms for mining the data in innovative ways, create applications that join the Open Government information with other publicly available data, and pursue a host of other potentially useful approaches. Other government agencies have begun to see the benefit of APIs, including the New York State Senate, New York State’s Open NY, and NYC OpenData. The Metropolitan Transportation Authority’s API currently powers numerous apps that make navigating public transportation easier than ever before. Many popular Web platforms also offer APIs, including Facebook, Twitter, and Amazon.
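Because the API is still in testing at CUSP and no documentation has been published, any concrete example is necessarily speculative. The sketch below is purely hypothetical: the base URL, resource names and fields are all invented, and it only illustrates the kind of query-and-join workflow the announcement describes:

```python
import json
import urllib.parse
import urllib.request

# Invented placeholder, not a real endpoint: the API is not yet public.
BASE = "https://api.nyopengovernment.example/v1"

def fetch(resource, **params):
    # Build a query string and return the decoded JSON response.
    url = f"{BASE}/{resource}?{urllib.parse.urlencode(params)}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Hypothetical resources and fields: pull one committee's campaign finance
# filings and the lobbying records naming it, then relate the two datasets.
filings = fetch("campaign-finance", committee="Example Committee")
lobbying = fetch("lobbying", client="Example Committee")
clients = {record["client"] for record in lobbying}
flagged = [f for f in filings if f["contributor"] in clients]
```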
This initiative is being led for the Attorney General's Office by Special Counsel Simon Brandler, Research Director Lacey Keller, Confidential Assistant Liam Arbetman, Information Technology Specialist Namita Mishra and Information Technology Specialist Kevin Ryan.
BitYota Seeks Out Gov API Opportunities for its Data Warehouse as a Service

Data-warehouse-as-a-service startup BitYota is seeking opportunities for its data storage and compute services to make use of open data APIs.
Data-warehouse-as-a-service startup BitYota is using APIs to make data analytics in the cloud scalable and elastic. Already working with marketing customers who need to draw in semi-structured data from multiple source streams to create business intelligence, the company now has its sights set on city and government use cases, where streaming in multiple data sources is also central to performing any meaningful analysis.

“The genesis of our technology is performance and scalability of a database engine, but with the horizontal scale-out of a Hadoop-like system,” says CEO and founder Dev Patel.
“Semi-structured data is usually in JSON, XML, or key-value formats. We are able to ingest that as first class objects of the data stream directly into our database. Analysts using BitYota do not need to learn any new tools: they can use SQL directly over these semi-structured data sets. And therefore, you get to analytics quickly. As soon as your data is loaded, you are able to run analytics.”
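BitYota's engine is proprietary, but the general idea of running ordinary SQL directly over semi-structured documents can be demonstrated with SQLite's JSON1 functions, which ship in most modern Python builds. The sample events below are invented:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (doc TEXT)")  # one JSON document per row

events = [
    {"user": "alice", "channel": "mobile", "spend": 12.5},
    {"user": "bob", "channel": "web", "spend": 3.0},
    {"user": "alice", "channel": "web", "spend": 7.25},
]
conn.executemany("INSERT INTO events VALUES (?)",
                 [(json.dumps(e),) for e in events])

# Plain SQL reaching into JSON fields: total spend per user, with no
# schema migration and no new query language to learn.
for row in conn.execute("""
        SELECT json_extract(doc, '$.user') AS user,
               SUM(json_extract(doc, '$.spend')) AS total_spend
        FROM events
        GROUP BY user
        """):
    print(row)
```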
Initial Use Cases: A Marketer’s 360 View
BitYota’s website confirms the use case Patel also described to ProgrammableWeb as the most common: marketers seeking a 360 view of a customer.
Current customers use their application APIs to feed a copy of the usage data from their web and mobile applications into BitYota’s data warehouse storage layer, integrating it with external APIs that stream in data from sources like web analytics, social media, and loyalty programs. Customers can then use another set of APIs to feed this data back into their business analysis and SQL tools of choice to analyze it for user value and churn. Patel confirms that marketers across a number of industry verticals are using this 360-view analytics capability.
“We take all this data in different forms, and bring it into the database. Once the data is loaded in BitYota’s storage layer, then our compute layer is turned on at the time of analysis. The user does the analysis in a compute engine, and when completed, the user shuts down the compute instance. So the compute instance is charged only once.” Pricing divides the storage and compute layers, so customers can select whether to use Amazon Web Services or Microsoft Azure for storage. Data can be added to storage via the user’s API as needed, and then run through compute only when analytics are required. A free trial and a developer account are available.
Because the data is fed into BitYota’s storage layer as needed from either APIs directly or from the customer’s cloud data storage, there is no data lock-in for end users. “Our ability to separate storage and compute gives a lot of flexibility to users for only paying when you are computing,” confirms Patel.
When creating the 360 view, each API’s underlying dataset needs a join-key that connects each data row with its equivalent record in another dataset (for example, knowing which mobile app user should be connected with which Twitter account). “Join keys can vary, for example, social media authentication keys. Join keys are dependent on the analysis, and we provide a platform on which they can do it. The join-key could be activity, time, user ID, location. But the 360 view of the consumer is definitely a very powerful use case,” says Patel, who counts SumAll as one of their customers.
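Stripped of the warehouse machinery, a join key is simply the shared field on which rows from two streams are matched. A toy Python sketch, with invented field names, makes the idea concrete:

```python
# Two invented data streams: app usage events and social profiles,
# both carrying the same user ID as the join key.
app_usage = [
    {"user_id": "u1", "sessions": 14},
    {"user_id": "u2", "sessions": 3},
]
social = [
    {"user_id": "u1", "twitter_handle": "@alice"},
    {"user_id": "u2", "twitter_handle": "@bob"},
]

# Index one stream by the join key, then attach each matching record
# from the other stream to build the combined view.
by_user = {row["user_id"]: row for row in social}
joined = [{**usage, **by_user.get(usage["user_id"], {})} for usage in app_usage]
print(joined)
```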
Seeking Out Government, City and Public Data Use Cases
Now, Patel hopes that BitYota can work with government agencies, city authorities, utilities and other public and open data publishers to offer them warehouse-as-a-service capabilities, both for internal departments and for their emerging ecosystems of potential external data and API users.
Patel explains:
One of the things we are seeing is that there are all sorts of hidden assets of data in government agencies, so just putting data out there via API isn’t the answer. Analysts may have SQL skills but are not necessarily skilled at writing the code for pulling data from an API.
Patel sees a natural alliance with open data publishers like Socrata, OpenDataSoft, and governments using the CKAN platform:
One of the themes here is to partner with organizations and have all of this data available in a data warehouse: the data is available, laid out in an appropriate way, available in a scalable architecture (we have separated storage and compute so storage can scale, and then independently add compute and scale that up and down).
The advantage is to pull all of this data from government systems (census data, realtime traffic feeds, crime data); then all of it is available in a cost-effective way and people pay only when they want to analyze the data.
Such data and analysis could be instrumental in city planning. Some potential use cases could be:
- A logistics company could optimize its food distribution deliveries across a city or region, factoring in supermarket locations and realtime or historical traffic congestion data
- Similarly, food co-ops or community groups could use a mix of data to advocate for strategies that address local food insecurity, lobbying for community garden plots, local farmers markets or a local produce store by looking at socio-economic demographics from the census, public transport routes, the current availability of supermarkets within walking distance, levels of crime and safety, and so on
- A planning department or company could analyze the number of car parking spaces needed in new office buildings, or identify optimal locations for major new infrastructure such as sporting facilities, based on population projections, current transport modalities and realtime feeds, available space, and so on
- Community and resident groups could draw on census, air quality, existing business location, public transport, traffic congestion, and crime and safety data to advocate for reducing liquor licenses in their area, or to argue for new medical centers, a library, a sports venue or childcare services in their neighborhood
Patel is excited by the opportunities BitYota could bring to the sector. “We are working on some proposals on this,” he says, but acknowledges government and city data is just “one of several markets” that BitYota is targeting at the moment. “We focus on opportunities where analysis of data from multiple streams is critical. This is one area where multiple streams of data are available and necessary for analytics.”
The difficulty will be the lead time that cities and governments currently take when considering new IT services or innovating on their processes. While there are signs of open innovation in government, the prevailing culture is much more monolithic, with convoluted procurement processes and difficulties for new entrants trying to explain and demonstrate their services to government customers. This is often a deterrent for cloud-based businesses seeking to leverage the new distributed application architecture that is emerging: why pursue civic opportunities when marketing, finance and gaming are ready to exploit new technologies? How BitYota generates civic opportunities for its product, and how these are received within government, should be a watching brief for any startup hoping to target this market.