API Spreadsheets News

These are the news items I've curated in my monitoring of the API space that have some relevance to the API definition conversation and that I wanted to include in my research. I'm using all of these links to better understand how the space is testing their APIs, going beyond just monitoring and understanding the details of each request and response.

Streaming Data From The Google Sheet JSON API

I have been learning how to use my new partner's service, which proxies any API, and uses Server-Sent Events (SSE) to push updates using JSON Patch. I am playing with making a variety of APIs real time using their service, and in my style, I wanted to share the story of what I'm working on, here on the blog. I was making updates to some data in a Google Sheet that I use to drive some data across a couple of my websites, and thought…can I make this spreadsheet streaming? Yes. Yes, I can.

To test out my theory I went and created a basic Google Sheet with two columns, one for product name, and one for price, simulating a product pricing list that I might want to stream across multiple websites, or possibly within client and partner portals. Then I published the Google Sheet to the web, making the data publicly available, so I didn't have to deal with any sort of authentication–something you will only want to do with publicly available data. I'll play around with an authenticated version at some point in the future, showing more secure examples.

Once I made the sheet public I grabbed the unique key for the sheet, which you can find in the URL, and placed it into this URL: [sheet key]/od6/public/basic?alt=json. The Google Sheet key takes a little bit to identify, but it is the long GUID in the URL, the longest part of the URL when editing the sheet. Once you put the key in the URL, you can paste the whole thing into your browser–giving you a JSON representation of your sheet, instead of HTML, basically giving you a public API for your Google Sheet. The JSON for Google Sheets is a little verbose and complicated, but once you study it a bit it doesn't take long for it to come into focus, showing each of the columns and rows.
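To make the URL pattern concrete, here is a small Python sketch. The base URL is an assumption based on the legacy public Sheets feed format–only the `[sheet key]/od6/public/basic?alt=json` portion comes from the post–and the `parse_entries` helper assumes the verbose feed's `title`/`content` structure:

```python
def sheet_json_url(sheet_key):
    """Build the public JSON feed URL for a published Google Sheet.
    The base URL is an assumption; only the path suffix comes from the post."""
    base = "https://spreadsheets.google.com/feeds/list"
    return f"{base}/{sheet_key}/od6/public/basic?alt=json"


def parse_entries(feed):
    """Pull the human-readable bits out of the verbose feed, where each
    entry carries a title and a flattened content string for the row."""
    return [
        {"title": e["title"]["$t"], "content": e["content"]["$t"]}
        for e in feed["feed"]["entry"]
    ]
```

Pasting the URL returned by `sheet_json_url` into a browser is all it takes to see the raw feed and confirm the structure before writing any client code.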

Next, I created an account with the service, verified my email, logged in, and created a new app–something that took me about two minutes. I took the new JSON URL for my Google Sheet and entered it as the target URL in my account. The UI then generates a curl statement for calling the API through the proxy. Before it will work, you will have to replace the second question mark with an ampersand (&), as the generated command assumes you do not have any parameters in the URL. Once replaced, you can open up your command line, paste in the command, and run it. Using Server-Sent Events (SSE) you'll see the script running, checking for changes. When you make any changes to your Google Sheet, you will see a JSON Patch response returned with those changes in real time–providing a real-time stream of your Google Sheet which can be displayed in any application.
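The question-mark fix is mechanical enough to script if you are generating these commands yourself. A minimal sketch (the URLs in the comment are placeholders, not the provider's actual endpoints):

```python
def fix_proxy_url(url):
    """Replace the second '?' in a URL with '&', since the generated curl
    command assumes the target URL has no query string of its own.
    e.g. https://proxy.example/sse?url=https://target.example/feed?alt=json
      -> https://proxy.example/sse?url=https://target.example/feed&alt=json"""
    first = url.find("?")
    second = url.find("?", first + 1)
    if second == -1:
        return url  # nothing to fix
    return url[:second] + "&" + url[second + 1:]
```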

Next, I'm going to make a simple JavaScript web page that will take the results and render them to the page, showing how to navigate the Google Sheets API response structure, as well as the JSON Patch responses using the JavaScript SDK. All together this took me about five minutes to make happen, from creating the Google Sheet, to firing up a new account, and executing the curl command. Sure, you'd still have to make it display anywhere, but it was quicker than I expected to make a Google Sheet real time. I'll spend a little more time thinking about the possibilities for using Google Sheets in this way, and publish some UI examples to Github, providing a forkable use case that anyone can follow when making it all work for them.
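The client-side work boils down to applying each JSON Patch message to a local copy of the sheet's JSON. This sketch (in Python rather than the JavaScript SDK, and covering only the `add` and `replace` operations of RFC 6902) shows the general mechanics:

```python
def apply_patch(doc, patch):
    """Apply a minimal subset of RFC 6902 JSON Patch ('add' and 'replace')
    to a document, using paths like '/feed/entry/0/content'."""
    for op in patch:
        parts = [p for p in op["path"].split("/") if p]
        target = doc
        for part in parts[:-1]:  # walk down to the parent of the target
            target = target[int(part)] if isinstance(target, list) else target[part]
        key = parts[-1]
        if isinstance(target, list):
            idx = len(target) if key == "-" else int(key)
            if op["op"] == "add":
                target.insert(idx, op["value"])
            else:
                target[idx] = op["value"]
        else:
            target[key] = op["value"]
    return doc
```

Each SSE message from the proxy would be fed through `apply_patch`, after which the page re-renders from the updated document.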

Disclosure: this streaming provider is an API Evangelist partner, and sponsors this site.

Using APIs To Enrich The Data You Have In Spreadsheets


Keeping Things One Dimensional To Go From API To Spreadsheet In One Step

I have been working on the next version of my human services work, which provides a way for cities to make information about organizations, locations, and services available on the web. Part of the feedback from the community around what was missing from the last version was the number of API calls you needed to make to get a complete representation of a resource and its sub-resources, as each API response was one-dimensional. An example would be that you could get a list of locations, but to get at the list of services you had to make a separate API call. This wasn't a lapse in API design, it was a result of the schema being born out of a CSV format, and me working to stay true to the original design and usage of the schema.

In the latest version, I did release a handful of paths that provide a complete representation of each resource and its sub-resources. However, I have maintained the original one-dimensional representation of each resource and its sub-resources, allowing me to offer an XML, JSON, as well as a CSV representation for each API call. This allows API consumers to pull CSV lists of organizations, locations, services, and their sub-resources like address and phone lists. While not something that would be useful in all API implementations, I feel like the audience for municipal level human services data will benefit significantly from being able to go from API to spreadsheet in a single step. All the GET paths for organizations, locations, and services are publicly available by default, not requiring authentication, making CSV data available via a single URL–something anyone can make happen.
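Serving a one-dimensional resource as CSV is only a few lines on top of an existing JSON representation. A minimal sketch, where the field names are illustrative rather than the actual HSDA schema:

```python
import csv
import io


def rows_to_csv(rows, fieldnames):
    """Serialize a one-dimensional API response (a list of flat records)
    straight to CSV text, so a spreadsheet user gets the data in one step."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()
```

Returning this string with a `text/csv` content type from the same handler that serves JSON is what makes the API-to-spreadsheet path a single step.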

While weighing API design decisions as part of my Human Services Data API (HSDA) work I am having to consider not just the technical details of how I should be doing this, but also how the API will be put to use, and who will be doing that. While I am thinking about the heavy system-to-system integration needs of human service providers, as well as the web, mobile, and other applications, I am also thinking about the individual user who might just need a list of the names of organizations, or the addresses of services, in a simple CSV format, so that they can work with the data in their most familiar format–the spreadsheet. I am just focusing on the API side of things at the moment, but once I'm done with the latest version I am going to think about some simple linking, and embeddable tooling that allows users to put CSV data from the API to work in a single click using Google Sheets and Microsoft Excel.

U.S. Energy Information Administration Excel Add-In and Google Add-On

I was looking through a number of federal government API implementations last week in preparation for a talk I did in Washington DC. The result of research like this is always a notebook full of interesting stories to tell about what federal agencies are up to with APIs. Today's story is out of the U.S. Energy Information Administration (EIA), with their Excel Data Add-In and Google Add-On tooling which allows you to download energy data from EIA's data API and economic data from the St. Louis Federal Reserve's Economic Data (FRED) API directly into your spreadsheet(s).

I'm regularly looking out for innovative uses of spreadsheets when it comes to deploying, as well as consuming APIs, because I believe it is the best way we have to turn average business users into API consumers, by piping data into the environment they are already using each day. EIA's data API contains 1.6 million energy series, and the St. Louis Federal Reserve's API contains 240,000 economic series. Making valuable, federal agency maintained data available within spreadsheets like this using APIs is something ALL other agencies should be emulating. First, agencies need to be doing public APIs, then they need to make sure they are also investing in spreadsheet tooling like the EIA is.

I’m adding this example of using Microsoft Excel and Google Sheets as an API client for not just federal government, but also for such valuable commerce and energy data, to my APIs and spreadsheets research. I’m also going to be on the hunt for open source solutions for delivering spreadsheet API connectivity like this. There should be a wealth of open source tooling that federal agencies can put to work when it comes to delivering data to spreadsheets, both internally, and externally with private sector partners. In a time where it is easy to get pretty depressed on a daily basis about APIs in the federal government, it makes me happy to find shining examples of APIs being put to work in such meaningful, and useful ways.

First Handful Of Lessons Using My Google Sheet Github Approach

With my recent shift to using Google Sheets as my data backend for my research, and my continued usage of Github as my data project publishing platform, I started pushing out some new API related lessons. I wanted to begin formalizing my schema and process for this new approach to delivering lessons with some simple topics, so I got to work taking my 101, and history of APIs work, and converting them into a multi-step lesson.

Some of my initial 101 API lessons are:

I will keep working on those 101 lessons–editing, polishing, expanding, and as I found out with this revision, removing some elements of APIs that are fading away. While my 101 stories are always working to reach as wide an audience as possible, my wider research is always rooted in both sides of the API coin, with information about providing APIs, while also keeping my API consumer hat on, and thinking about the needs of developers and integrators.

Now that I have the 101 lessons underway I wanted to focus on my API life cycle research, and work on creating a set of high level lessons for each of the 80+ stops I track on along a modern API life cycle. So I got to work on the lesson for API definitions, which I think is the most important stop along any API life cycle–one that actually intersects with every other line.

  • Definitions (Website) (Github Repo)

After kicking off a lesson for my API life cycle that speaks to API providers, I wanted to shift gears and look at things from the API consumer side of things, and kick off a lesson for what I consider to be one of the more important APIs today–Twitter.

Like my life cycle research, I will continue creating lessons for each area of my API Stack research, where I am studying the approaches of specific API platforms, and the industries they are serving. Next I will be doing Facebook, Instagram, Reddit, and other APIs that are having a significant impact on our world. I'm looking to create lessons for all the top APIs that have big brand recognition, and leverage them to help onboard a new wave of API curious folks.

My API industry research all lives as separate data-driven Github repositories, using Google Sheets as the central data store. I edit all the stories published across these sites in a separate editor, but the data behind all my research lives in a series of spreadsheets. This model has been extended to my API lessons, and I'll be shifting my storytelling to leverage more of a structured approach in the future. To help onboard folks with the concept I've also created a lesson about how you create data-driven projects like this:

  • Google Sheets To Github Website (Website) (Github Repo) (Google Sheet) - Walking through how you can use Google Sheets, and a Github Pages site to manage data driven websites.

All of these lessons are works in progress. It is why they run on Github, so that I can incrementally evolve them. An essential part of this is getting feedback from folks on what they’d like to learn. I’m happy to open up and collaborate around any of these lessons using Google Sheets or Github–you just let me know which one is more your jam. I am collaborating with my partner in crime Audrey Watters (@audreywatters) using this format, and I’m finding it to be a great way to not just manage my world, but also create and manage new worlds with other people.

While each of the lessons uses the same schema, structure, and process, I'm reserving the right to publish the lessons in different ways, experimenting with different variations in the layout. You'll notice the Twitter and Google Sheets to Github Website lessons have Github issues associated with each step, as I'm looking to stimulate conversations about what makes good (or bad) curriculum when it comes to learning about APIs and the platforms I'm building on. When it comes to my API life cycle and stack work I am a little more opinionated and not looking for as much feedback at such a granular level, but because each lesson does live on Github, folks are still welcome to edit, and share their thoughts.

I have hundreds of lessons that I want to develop. The backlog is overwhelming. Now that I have the schema, base process, and first few stories published, I can just add to my daily workload and publish new stories, and evolve existing ones as I have time. If there are any lessons you'd like to see, either at the 101, provider, or consumer level, let me know–feel free to hit me up through any channel. I'm going to be doing these lessons for my clients, either publishing them privately or publicly to Github repositories, and developing API life cycle curriculum in this way. I am also going to develop a paid version of the lessons, which will perform alongside my API industry guides, as simple, yet rich walkthroughs of specific API industry concepts–for a small fee, to support what I do. Ok, lots of work ahead, but I'm super stoked to have these first few lessons out the door, even if there is a lot of polishing still to be done.

Spreadsheet To Github For Sample Data CI

I’m needing data for use in human service API implementations. I need sample organizations, locations, and services to round off implementations, making it easier to understand what is possible with an API, when you are playing with one of my demos.

There are a number of features that require there to be data in these systems, and a system is always more convincing when it has intuitive, recognizable entries, not just test names, or Latin filler text. I need a variety of samples, in many different categories, with complete phone, address, and other specific data points. I also need this across many different APIs, and ideally, on demand when I set up a new demo instance of the human services API.

To accomplish this I wanted to keep things as simple as I can so that non-developer stakeholders could get involved, so I set up a Google spreadsheet with a tab for each type of test data I needed–in this case, it was organizations and locations. Then I created a Github repository, with a Github Pages front-end. After making the spreadsheet public, I pull each worksheet using JavaScript, and write to the Github repository as YAML, using the Github API.
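The worksheet-to-YAML step is the heart of this workflow. The sketch below hand-rolls the YAML emission for flat records (so it stays dependency-free; real data may need proper escaping), and the commit to the repository's _data folder would then go through the Github contents API:

```python
def rows_to_yaml(rows):
    """Emit a list of flat records as simple YAML list items, suitable
    for dropping into a Jekyll _data folder as e.g. organizations.yml.
    Values are quoted naively; this is a sketch, not a YAML library."""
    lines = []
    for row in rows:
        prefix = "- "  # first key of each record starts a new list item
        for key, value in row.items():
            lines.append(f'{prefix}{key}: "{value}"')
            prefix = "  "  # subsequent keys are indented under the item
    return "\n".join(lines) + "\n"
```

With the YAML in hand, a single PUT of the base64-encoded content to the repository finishes the publish step.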

It is kind of a poor man’s way of creating test data, then publishing to Github for use in a variety of continuous integration workflows. I can maintain a rich folder of test data sets for a variety of use cases in spreadsheets, and even invite other folks to help me create and manage the data stored in spreadsheets. Then I can publish to a variety of Github repositories as YAML, and integrated into any workflow, loading test data sets into new APIs, and existing APIs as part of testing, monitoring, or even just to make an API seem convincing.

To support my work I have a spreadsheet published, and two scripts, one for pulling organizations, and the other for pulling locations–both of which publish YAML to the _data folder in the repository. I'll keep playing with ways of publishing test data like this, for use across my projects. With each addition, I will try and add a story to this research, to help others understand how it all works. I am hoping that I will eventually develop a pretty robust set of tools for working with test data in APIs, as part of a test data continuous publishing and integration cycle.

Using Google Sheet Templates For Defining API Tests

The Runscope team recently published a post on a pretty cool approach to using Google Sheets for running API tests with multiple variable sets, which I think is valuable at a couple of levels. They provide a template Google Sheet for anyone to follow, where you can plug in your variables, as well as your Runscope API key, which allows you to define the dimensions of the tests you wish to push to Runscope via their own API.

The first thing that grabs me about this approach is how Runscope is allowing their customers to define and expand the dimensions of how they test their API using Runscope in a way that will speak to a wider audience, beyond just the usual API developer audience. Doing this in a spreadsheet allows Runscope customers to customize their API tests for exactly the scenarios they need, without Runscope having to customize and respond to each individual customer's needs--providing a nice balance.
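The mechanics of the template are easy to picture: each sheet row is one variable set, expanded into one test configuration. A sketch of that expansion, where the payload shape is illustrative and not Runscope's actual API schema:

```python
def rows_to_test_configs(header, rows, base):
    """Expand each sheet row of variables into a test configuration.
    'header' is the sheet's first row (variable names), 'rows' the data
    rows, and 'base' the shared test definition. The config shape here
    is a hypothetical example, not Runscope's real schema."""
    configs = []
    for row in rows:
        config = dict(base)  # copy the shared definition per test
        config["variables"] = dict(zip(header, row))
        configs.append(config)
    return configs
```

Each resulting config would then be POSTed to the testing API, one call per row, which is exactly the kind of loop a Google Sheets script can run for a non-developer.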

The second thing that interests me about their approach is the usage of a Google Sheet as a template for making API calls, whether you are testing your APIs, or any other scenario an API enables. This type of templating of API calls opens up the API client to a much wider audience, making integration copy-and-pastable, shareable, collaborative, and something anyone can reverse engineer and learn about the surface area of an API--in this scenario, it just happens to be the surface area of Runscope's API testing API.

Runscope's approach is in alignment with my previous post about sharing data validation examples. A set of assertions could be defined within a spreadsheet, and any stakeholder could use the spreadsheet to execute them and make sure the assertions are met. This would have huge implications for the average business user, helping make sure API contracts are meeting business objectives. I'm considering using this approach to empower cities, counties, and states to test and validate human services API implementations as part of my Open Referral work.

I told John Sheehan, the CEO of Runscope, that their approach was pretty creative, and he said that "Google sheets scripts are underrated" and that Google Sheets is the "API client for the everyperson". I agree. I'd like to see more spreadsheet templates like this used across the API life cycle when it comes to design, deployment, management, testing, monitoring, and every other area of API operations. I'd also like to see more spreadsheet templates available for making calls to other common APIs, making APIs accessible to a much wider audience, who are familiar with spreadsheets, and more likely to be closer to the actual problems which API solutions are designed to solve.

OpenAPI Spec Google Spreadsheet to Github Jekyll Hosted YAML

I have been playing around with different ways of using Google Spreadsheet to drive YAML and JSON data to Jekyll data projects hosted as Github repositories. It is an approach I started playing around with in Washington DC, while I was helping data stewards publish government services as JSON-LD. It is something I've been playing around with lately, using it to drive D3.js visualizations and even a comic book.

There are a couple of things going on here. First, you are managing machine-readable data using Google Spreadsheets, and publishing this data in two separate machine-readable formats: JSON and YAML. When these formats are combined with the data capabilities of a Jekyll website hosted on Github Pages, it opens up some pretty interesting possibilities for using data to fuel some pretty fun things–no backend needed.

To push this approach forward I wanted to apply it to managing OpenAPI Specs that can be used across the API life cycle. I pulled together a spreadsheet template for managing the details I need for an OpenAPI Spec. Then I created a Github repository, forked my previous spreadsheet to YAML project, and modified it to pull data from a couple of worksheets in the Google Sheet and publish them as both JSON and YAML OpenAPI Specs.
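The assembly step amounts to mapping worksheet rows onto the spec's structure. A minimal sketch, targeting the Swagger 2.0 era format this work used; the worksheet column names (`title`, `path`, `method`, `summary`) are assumptions about how the template might be laid out:

```python
def rows_to_openapi(info_row, path_rows):
    """Assemble a minimal Swagger 2.0 style structure from spreadsheet
    rows: one info worksheet row, plus one row per path/method."""
    spec = {
        "swagger": "2.0",
        "info": {"title": info_row["title"], "version": info_row["version"]},
        "paths": {},
    }
    for row in path_rows:
        spec["paths"].setdefault(row["path"], {})[row["method"].lower()] = {
            "summary": row["summary"],
            "responses": {"200": {"description": "successful response"}},
        }
    return spec
```

Serializing the resulting dict as JSON and YAML gives both artifacts for the Jekyll _data folder in one pass.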

My OpenAPI Spec Google Sheet to YAML for use in a Jekyll project hosted on Github is just a prototype. The results don't always validate, and I'm playing with different ways to represent and manage the data in the Google Sheet. It is a fun start though! I am going to keep working on it, and probably start a similar project for managing an APIs.json index using Google Sheets. When done right, it might provide another way that non-developers can participate in the API design process, and apply OpenAPI Specs to other stops along the API life cycle, like API documentation, SDK generation, or testing and monitoring.

That Thing You All Are Doing With Slack Integrations, @Blockspring Is Doing It With Spreadsheets

One common thing you hear about the growing number of integrations and bots that are leveraging the Slack API is that they are all about injecting some specific action into the platform and tooling we are all already using. Startups like Current, who are providing payments within your Slack timeline, use the slogan, "transact where you interact". I began to explore this concept, which I call API injection, and it is something I'm sure I'll be talking about over and over in the future, with the growth in bot, voice, and other API enabled trends I am following.

The concept is simple. You inject a valuable API driven resource, such as payments, knowledge bases, images, video, or other resources, into an existing stream, within an existing platform or tool you are already putting to use. It is not a new concept, it is just seeing popularity when it comes to Slack, but it really has been happening for a while with Twitter Cards, and chatbots, which have been around for some time.

I'm seeing another incarnation of this coming from my friends over at Blockspring, who I've showcased for some time, for bringing valuable API resources to spreadsheet users. Blockspring just released Google Sheets Templates, which enables your spreadsheets to "do useful things", and "get data, use powerful services, and do things you didn't think were possible in a spreadsheet." Instead of Slack being the "existing tool", it is Google Spreadsheet, but the approach is the same -- bring useful API driven resources to users, where they already live and work.

As I continue to watch the latest bot evolution, a stark contrast has emerged between the types of bots we've seen on Twitter and on Slack--a contrast I feel is being defined by the tone and business viability of each platform. Twitterbots are much more whimsical, about stories, poetry, and other information, where Slackbots are very much productivity and business focused, making Slack a prime target for the next wave of startups, and VCs.

I think there is a big opportunity to deliver valuable API resources into the timeline of Slack users. I think there is an even bigger opportunity to deliver valuable API resources into the worksheet of the average business spreadsheet user. The problem is that these opportunities also mean there will be a significant amount of crap, noise, and pollution injected into these channels. I'm just hoping there will be a handful of providers who can do this right, and actually bring value to the average Slack user, as well as the everyday spreadsheet worker -- you know who you are.

When Are We Going To Get A Save As JSON In Our Spreadsheets?

My last rant of the evening, I promise. Then I will shut up and move back to actual work instead of telling stories. I'm working on my Adopta.Agency project, processing a pretty robust spreadsheet of Department of Veterans Affairs expenditures by state. As I'm working to convert yet another spreadsheet to CSV, and then to JSON, and publish to Github, I can't help but think, "where is the save as JSON" in Microsoft or Google Spreadsheets?

I can easily write scripts to help me do this, but I'm trying to keep the process as close to what the average person, who will be adopting a government agency data set, will experience. I could build a tool that they could also use, but I really want to keep the tools required for the work as minimal as possible. 
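For the record, the script I keep rewriting amounts to only a few lines, which is exactly why it is so frustrating that it isn't a built-in menu item. A minimal sketch of the CSV-to-JSON step:

```python
import csv
import io
import json


def csv_to_json(csv_text):
    """The 'Save as JSON' I keep wishing for: read CSV text and return
    a JSON array of row objects keyed by the header row."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows, indent=2)
```

Everything stays a string coming out of CSV; typing numbers and dates properly is the part a real "Save as JSON" feature would have to get right.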

It would just be easier if Microsoft and Google would get with the program, and give us a built in feature for saving our spreadsheets as JSON.

Project Idea: Codenvy-Like Containerized Spreadsheets

Last week I wrote a story about a company I'm advising called Codenvy, which is delivering modular, scalable, cloud development environments using their web IDE and Docker. I'm currently working my way through the spreadsheet to API, and API to spreadsheet solutions I track on, and it is making me think that someone should deliver a Codenvy-like containerized spreadsheet environment.

With this type of environment you could forever end the emailing of spreadsheets, allowing people to craft spreadsheets, complete with all the data and visualization goodness they desire, and clone, fork, share, and collaborate around these containerized spreadsheets. You could have a library of master planned spreadsheets that your company depends on, and manage user permissions, analytics, and even scalability, performance, and distribution of vital spreadsheets.

Using this platform you could easily spawn spreadsheet driven APIs using services like API Spark, and you could come full circle and have spreadsheets that obtain their data directly from valuable API driven resources, to really blow your mind. Personally I'm not a big spreadsheet lover, but I do recognize that it is where a majority of the world's data resides, and the user interface of choice for many business folk--making it a good candidate for the next generation of containerized API data sources, and clients.

Just another one of my random ideas that accumulate in my Evernote, that I'll never have time to build myself, and am hoping you will tackle for me. As I review all of the existing spreadsheet API solutions available currently, I'm sure I'll have more API centric, spreadsheet ideas bubble over to share with you.

Using Excel As An API Datasource And An API Client For The Masses

I’ve been tracking on the usage of spreadsheets in conjunction with APIs for several years now. Spreadsheets are everywhere, they are the number one data management tool in the world, and whether API developers like it or not, spreadsheets will continue to collide with the API space, as both API providers and consumers try to get things done using APIs.

APIs are all about getting access to the resources you need, and spreadsheets are being used by both API providers and consumers to accomplish these goals. It makes complete sense to me that business users would be looking for solutions via spreadsheets, as they are one potential doorway to hacking for the average person—writing macros, calculations, and other dynamic features people execute within the spreadsheet.

I know IT would like to think their central SQL, MySQL, Postgres, Oracle, and other databases are where the valuable data and content assets are stored at a company, but in reality the most valuable data resources are often stored in spreadsheets across an organization. When it comes time to deploy APIs, this is the first place you should look for your datasources, resulting in Microsoft Excel and Google Spreadsheet to API solutions like we’ve seen from API Spark.

I’m seeing spreadsheets used by companies to deploy APIs in some of the following ways:

  • Microsoft Excel - Turning Microsoft Excel spreadsheets directly into APIs, taking a spreadsheet and generating an API, is the fastest way to go from a closed data resource to an API anyone can access, even without programming experience.
  • Google Spreadsheet - Mounting public and private Google Spreadsheets is an increasingly popular way to publish smaller datasets as APIs. Since Google Spreadsheets is web-based, it becomes very easy to use the Google Spreadsheet API to access any Spreadsheet in a Google account, then generate a web API interface that can allow for reading or writing to a spreadsheet data source via a public, or privately secured API.

Beyond deploying APIs I’m seeing API providers provide some innovative ways for users to connect spreadsheets to their APIs:

  • Spreadsheet as Client - Electronic parts search API Octopart has been providing a bill of materials (BOM) solution via Microsoft Excel, and now Google Spreadsheets for their customers--providing a distributed parts catalog in a spreadsheet, that is kept up to date via public API.
  • Spreadsheet as Cache - I’ve talked with U.S. Census and other data providers about how they provide Microsoft Excel and Google Spreadsheet caches of API driven data, allowing users to browse, search and establish some sort of subset of data, then save as a spreadsheet cache for offline use.
  • Spreadsheet as Catch-All - Spreadsheets aren’t always just about data; you can see Twilio storing SMS messages, and NPR using them as a crowdsourcing engine, making spreadsheets a nice bucket for catching just about anything an API can input or output.

Moving out of the realm of what API providers can do for their API consumers with spreadsheets, and into the world of what API consumers can do for themselves, you start to see endless opportunities for API integration with spreadsheets using reciprocity providers:

  • Zapier - There are five pages of recipes on the popular API reciprocity provider Zapier that allow you to work with Google Docs, and 57 pages that are dealing directly with Google Drive, providing a wealth of tools that non-developers (or developers) can use when connecting common APIs up to Google Spreadsheets.

I’ve seen enough movement in the area of Microsoft Excel and Google Spreadsheets being used with APIs to warrant closer monitoring. To support this I've started publishing most of my research to an API Evangelist spreadsheet research site, which will allow me to better track, curate, tag, and tell stories around spreadsheets and APIs.

As I do with my 60+ API research projects, I will update this site when I have time, publishing anything I've read, written, and companies I think are doing interesting things with spreadsheets and APIs. I'm pretty convinced that spreadsheets will be another one of those bridge tools we use to connect where we are going with APIs, with the reality of where the everyday person is, just trying to get their job done.

Disclosure: API Spark is an API Evangelist partner.

Route SMS Messages To Google Spreadsheets Via Twilio API With TwilioSheet

If you follow the Twilio blog or Twitter account you can always find a good API story from the API leader. It also makes me happy to see trends I’m seeing from other providers reinforced by the API heavyweight. This time it is spreadsheet integration with common API resources, like Twilio SMS.

Twilio has a pretty slick tool they call TwilioSheet that allows you to receive SMS messages in a Google Spreadsheet, and created a pretty nice walkthrough of the entire setup. Providing this type of functionality helps, as Twilio says, "make it easy for developers and non-developers alike to receive SMS messages in a Google Spreadsheet”—emphasis on the non-developers.
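Under the hood, tools like this just turn the form-encoded webhook Twilio sends for each inbound message into a new spreadsheet row. A minimal sketch of that translation (`From` and `Body` are standard Twilio SMS webhook parameters; the row shape is my own choice):

```python
from urllib.parse import parse_qs


def sms_webhook_to_row(body):
    """Turn the form-encoded body of a Twilio SMS webhook into a flat
    row, ready to append to a spreadsheet."""
    params = {k: v[0] for k, v in parse_qs(body).items()}
    return [params.get("From", ""), params.get("Body", "")]
```

A Google Apps Script doing the equivalent on the spreadsheet side is all it takes to receive texts in a sheet, which is presumably the shape of what TwilioSheet automates.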

Whether we like it or not, the spreadsheet is the #1 database solution in the world, and spreadsheets provide a huge opportunity when it comes to bridging the world of APIs with the wider business landscape. This is something that API reciprocity providers like Zapier have been bridging for a while now, and something that API providers like Intuit are looking to bake into their API platforms.

When you see Twilio doing something like providing spreadsheet integration for their API platform, you have to stop and consider whether or not it is something that might work for your own API platform. Spreadsheet integration by default with your API driven resources is a great way to expand the reach of any API, bringing these valuable resources within reach of the actual, everyday problem owners.

Exhaust From Crunching Open Data And Trying To Apply Page Rank To Spreadsheets

I stumbled across a very interesting post on page rank for spreadsheets. The post is a summary of a talk, but it provides an interesting look at trying to understand open data at scale, something I've tried doing several times, including my Adopt A Federal Government Dataset work, which reminds me of how horribly out of date it all is.

There is a shitload of data stored in Microsoft Excel, Google Spreadsheet and CSV files, and trying to understand where this data is, and what is contained in these little data stores is really hard. This post doesn’t provide the answers, but gives a very interesting look into what goes into trying to understand open data at scale.

The author acknowledges something I find fascinating, that “search for spreadsheet is hard”—damn straight. He plays with different ways of quantifying the data based upon the number of columns, rows, content, data size, and even file formats.
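The exact features the author used aren't spelled out in my notes, but the kind of quantifiers he describes are easy to picture. A rough sketch of computing them for a CSV file, with illustrative stand-in metrics:

```python
import csv
import io

def csv_metrics(text):
    """Return simple quantifiers for a CSV document: rows, columns, size."""
    rows = list(csv.reader(io.StringIO(text)))
    return {
        "rows": len(rows),
        "columns": max((len(r) for r in rows), default=0),
        "bytes": len(text.encode("utf-8")),
    }

# A tiny product/price sheet, like the one described in the streaming post.
sample = "product,price\nwidget,9.99\ngadget,19.99\n"
print(csv_metrics(sample))  # → {'rows': 3, 'columns': 2, 'bytes': 39}
```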

This type of storytelling from the trenches is very important. Every time I work to download, crunch and make sense of, or quantify open data, I try to tell the story in real-time. This way much of the mental exhaust from the process is public, potentially saving someone else some time, or helping them see it through a different lens.

Imagine if someone made the Google, but just for public spreadsheets. Wish I had a clone!

Secure API Deployment From MySQL, JSON and Google Spreadsheets With 3Scale

I'm doing a lot more API deployments from dead simple data sources since I started working in the federal government. As part of these efforts I'm working to put together a simple toolkit that newbies to the API world can use to rapidly deploy APIs as well.

A couple of weeks ago I worked through the simple, open API implementations, and this week I want to show how to secure access to the API by requiring an AppID and AppKey, which will allow you to track who has access to the API.

I'm using 3Scale API management infrastructure to secure the demos. 3Scale has a free base offering that allows anyone to get up and running with API key requirements, analytics, and other essentials with very little investment.
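The core of the AppID/AppKey pattern is simple: every request must carry both credentials, and the pair must match a registered application. In the real deployment the check happens against 3Scale's management infrastructure; here is a bare-bones sketch of the access-control logic with an in-memory dict and hypothetical credentials standing in:

```python
# Hypothetical registered application credentials (in production this
# lookup would be an authorization call to 3Scale, not a local dict).
REGISTERED_APPS = {"demo-app-id": "demo-app-key"}

def authorize(app_id, app_key):
    """Return True only when the app_id exists and its app_key matches."""
    return REGISTERED_APPS.get(app_id) == app_key

print(authorize("demo-app-id", "demo-app-key"))  # → True  (valid pair)
print(authorize("demo-app-id", "wrong-key"))     # → False (bad key)
print(authorize("unknown-id", "demo-app-key"))   # → False (unknown app)
```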

Currently I have four separate deployment blueprints done:

All of these samples are in PHP and use the Slim PHP REST framework. They are meant to be working examples that you can use to seed your own API deployment.

You can find the entire working repository, including the Slim framework, at Github.

API Deployment From MySQL, JSON, Github and Google Spreadsheets

I'm doing a lot more API deployments from dead simple data sources since I started working in the federal government. As part of these efforts I'm working to put together a simple toolkit that newbies to the API world can use to rapidly deploy APIs as well.

Currently I have four separate deployment blueprints done:

All of these samples are in PHP and use the Slim PHP REST framework. They are meant to be working examples that you can use to seed your own API deployment.

I'm also including these in my government API workshop at #APIStrat this week, hoping to get other people equipped with the necessary skills and tools they need to get APIs in the wild.

You can find the entire working repository, including the Slim framework, at Github.

Mobile Editing of Google Spreadsheets in 45 Languages

Google has enabled many of the same mobile editing capabilities found in Google Docs for Google Spreadsheets.

You can now edit Google Spreadsheets in 45 languages on Android, iPhone, and iPad devices.

Google has also added eight separate print sizes in Google Spreadsheets, including tabloid, statement, executive, folio, A3, A5, B4, and B5.

These are new features that continue adding to Google Docs' viability as a powerful publishing platform.

Visualize Big Data with BigQuery and Google Spreadsheets

Being the big data geek I am, I'm pretty excited that Google has integrated BigQuery with Google Apps Script and Google Spreadsheets.

This means you can take the power of BigQuery for storing and querying very large data sets and combine it with any of your Google Spreadsheets using Google Apps Script.

With a simple Google Apps Script query you can embed and process a query that pulls counts, fields, and other information you use in spreadsheet columns and data visualizations. You can upload VERY large sets of data, like the M-Lab dataset in BigQuery, which contains 240B rows!
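The flow described above boils down to: run an aggregate query over a big table, then drop the results into spreadsheet cells. A scaled-down sketch of that flow, with sqlite3 standing in for BigQuery and a list of lists standing in for the spreadsheet (the real integration runs through Apps Script's BigQuery service, not this code):

```python
import sqlite3

# Stand-in for a BigQuery table of measurement data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE measurements (country TEXT, download_speed REAL)")
conn.executemany(
    "INSERT INTO measurements VALUES (?, ?)",
    [("US", 12.1), ("US", 9.8), ("DE", 15.3)],
)

# The aggregate query that would run against BigQuery, scaled down.
rows = conn.execute(
    "SELECT country, COUNT(*) FROM measurements GROUP BY country ORDER BY country"
).fetchall()

# Drop the query results into "spreadsheet" columns.
spreadsheet = [["country", "count"]] + [list(r) for r in rows]
print(spreadsheet)  # → [['country', 'count'], ['DE', 1], ['US', 2]]
```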

I think this is an excellent example of the power of cloud computing. You can take a web application like Google Spreadsheets and truly harness the computing power of the cloud to make sense of large data sets.

There is more information on their recent blog post BigQuery, meet Google Spreadsheets or visit the BigQuery website.

Google Cloud Platform - Spreadsheets as Data Stores

I attended many great sessions while at Google I/O, and I really started to see where they are going with using Google Spreadsheets as data stores.

I have used spreadsheets as data stores for quite a while, but they are really pushing for them to store enterprise data, application data, web site data, and much more.

Way back in the day, when I was doing non-profit work, I conducted a needs assessment at a large educational non-profit to build a specification for developing a new administrative system.

When I first went in, management told me there was a central database system. After conducting my multi-day technology needs assessment, I found out that the staff actually utilized many, many spreadsheets as data stores to get their work done. They rarely used the central system.

I find this to be common across many organizations.

This is why I think Google Apps has huge potential for spreadsheets to be widely adopted as data stores for the masses in non-profits, businesses, and enterprises.

I'll talk more formally about this later.

Google Spreadsheets as a Data Store

I just finished BETA testing the DocSyncer tool, which will keep my computer synced with my Google Docs account.

At first I had settings turned on to upload .html pages, which was a huge mistake. It instantly started publishing thousands of files from my local workstation, so be careful.

Once I did get my 800+ Word and Excel documents synced up without any effort, I started looking through them and searching on topics. It was very interesting to find some items I didn't even remember I had.

Then I pulled out a simple JavaScript snippet for pulling a JSON feed from a Google Spreadsheet and connected it to a list of businesses I had for the Portland area. Instantly I had published these business listings on the web using my Google Spreadsheet as a data store.
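The old public Google Spreadsheets JSON feed wraps each row as an entry with `title.$t` and `content.$t` fields. A sketch of the same feed-parsing idea, done here in Python against a hand-built sample payload (the business names are made up for illustration):

```python
import json

# Hand-built stand-in for what the public spreadsheet feed URL returns.
sample_feed = json.loads("""
{"feed": {"entry": [
  {"title": {"$t": "Portland Coffee Co"}, "content": {"$t": "address: 123 Main St"}},
  {"title": {"$t": "Rose City Books"}, "content": {"$t": "address: 456 Oak Ave"}}
]}}
""")

def listings(feed):
    """Turn spreadsheet feed entries into (name, details) pairs."""
    return [(e["title"]["$t"], e["content"]["$t"]) for e in feed["feed"]["entry"]]

for name, details in listings(sample_feed):
    print(name, "-", details)
```

In practice you would fetch the feed from the published sheet's URL instead of a hardcoded string; the parsing stays the same.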

Very cool stuff. I think it will be a while before everyone realizes the potential here.

If you think there is a link I should have listed here, feel free to tweet it at me, or submit it as a Github issue. Even though I do this full time, I'm still a one-person show, and I miss quite a bit, so I depend on my network to help me know what is going on.