Hack for MN 2014 Open Hack Projects

We’d like to thank Tech.MN for recording all of the project presentations.


Airstatus

The Airstatus team is aiming to create a single API for air quality information, drawing on many different data sources across the U.S. and the world, to make it easy for designers and developers to include air quality information in their apps and sites. By the end of Hack for MN, the API could provide information from approximately 3,000 stations across North America, and the team was working on pulling in air quality data from the U.K. and Germany.

The team also created a demo web and text application to map, visualize, and retrieve air quality information based on the Airstatus API. Besides demonstrating how to use the API, this application also gives the team an environment to experiment with ways of presenting highly technical information to the general public.
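The core of an aggregation API like this is normalizing differently shaped station records into one common schema. Here is a minimal sketch of that idea; the field names and source formats are invented for illustration and are not Airstatus's actual data model:

```python
# Hypothetical sketch: normalize air-quality records from two differently
# structured sources into one uniform shape, as an aggregation API might.

def normalize_us(record):
    """US-style record: {'site': ..., 'pm25': ..., 'lat': ..., 'lon': ...}."""
    return {
        "station": record["site"],
        "pm25_ugm3": record["pm25"],
        "location": (record["lat"], record["lon"]),
        "source": "us",
    }

def normalize_uk(record):
    """UK-style record: {'name': ..., 'particulates': {...}, 'coords': [...]}."""
    return {
        "station": record["name"],
        "pm25_ugm3": record["particulates"]["pm2.5"],
        "location": tuple(record["coords"]),
        "source": "uk",
    }

def aggregate(us_records, uk_records):
    """Merge all sources into one list with a uniform schema."""
    return ([normalize_us(r) for r in us_records] +
            [normalize_uk(r) for r in uk_records])

stations = aggregate(
    [{"site": "Minneapolis-St. Paul", "pm25": 8.2,
      "lat": 44.97, "lon": -93.26}],
    [{"name": "London Bloomsbury", "particulates": {"pm2.5": 11.0},
      "coords": [51.52, -0.13]}],
)
print(len(stations))  # 2
```

Once every source is reduced to the same shape, serving a single API over the combined list becomes straightforward.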


PLANIT

This project, first pitched at the CityCampMN 2013 hackathon, started with the task of creating a single place where residents can view and plan informal activities at city parks, like pickup basketball games. The team was also motivated by the questions of how to engage youth in nearby activities, and how to help individuals meet people at parks.

“It’s like a Yelp for activities”

The bulk of team PLANIT’s work during Hack for MN involved creating a dataset and map of Minneapolis parks that includes information on activities and services provided at parks, based on combining data from the City of Minneapolis and Hennepin County.
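Combining the two sources amounts to a keyed merge on park name. A minimal sketch of that merge, with park names and fields invented for illustration rather than taken from the team's actual data:

```python
# Hypothetical sketch: merge city and county park records keyed on park name.
city = {
    "Powderhorn Park": {"activities": ["basketball", "soccer"]},
    "Loring Park":     {"activities": ["walking paths"]},
}
county = {
    "Powderhorn Park": {"services": ["restrooms"]},
    "Minnehaha Park":  {"services": ["picnic shelters"]},
}

# Union of park names, with missing fields defaulting to empty lists.
merged = {}
for name in set(city) | set(county):
    merged[name] = {
        "activities": city.get(name, {}).get("activities", []),
        "services": county.get(name, {}).get("services", []),
    }

print(sorted(merged))  # ['Loring Park', 'Minnehaha Park', 'Powderhorn Park']
```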

With that accomplished, the team now wants to build a way for residents to submit their own activities, and to make the platform usable via text messaging.

The API Store

Continuing their work from Capitol Code/CodeAcross 2014, the API Store team has been tackling the business, policy, and political issues that arise when a government decides to make its data openly available as downloadable datasets or APIs. Having established the framework of open data publishers and subscribers, the team has begun to identify the expectations and responsibilities of each group, and the characteristics of the relationship between them.

The API Store team has examined two important features that a government open data portal should have: a catalog of all open data provided by that government; and analytics on the use of published data for the benefit of both publishers (to understand what datasets are popular and the sources of extreme demand) and users (to better understand performance issues of applications that use open data).
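The analytics side can be sketched as simple aggregation over a portal's access log. The log format below is invented for illustration; a real portal would read this from its web server logs:

```python
from collections import Counter

# Hypothetical request log: (dataset, client) pairs as a portal might record.
log = [
    ("bike-lanes", "app-a"), ("bike-lanes", "app-a"), ("bike-lanes", "app-b"),
    ("crime-stats", "app-c"), ("bike-lanes", "app-a"),
]

by_dataset = Counter(ds for ds, _ in log)  # popularity per dataset
by_client = Counter(c for _, c in log)     # sources of heavy demand

print(by_dataset.most_common(1))  # [('bike-lanes', 4)]
print(by_client.most_common(1))   # [('app-a', 3)]
```

Even this crude tally answers both questions the team identified: which datasets are popular, and which consumers generate the most load.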

Voter Guide

Building upon the crowd-sourcing and visualization work of a Capitol Code 2014 project, the Voter Guide team began by seeking to expand the collected social media accounts of Minnesota politicians to include 2014 candidates for office. Once they had expanded their scope to candidates, the team decided to go further and create a full guide for voters on all of the candidates running in their district(s), beginning with a site that makes it easy for voters to find their candidates' social media accounts.

With a Drupal site up and running that does just that, the team is now looking at what other information would be useful to voters, including official information such as campaign contributions. The team is also planning to create an API so that other sites and developers can easily incorporate this information.

Homescreen for the Homeless

Based on the observation that smartphones are becoming more prevalent among homeless populations, and that basic information about resources available for the homeless can be very hard to find, this team created an Android app to provide a simplified listing of social services and homeless assistance programs.

Homescreen for the Homeless organizes and lists phone numbers, URLs, and apps that can connect homeless populations with resources such as shelters, health care, and legal aid. These resources are organized by focus or situation, and integrate intelligently with the smartphone to trigger appropriate actions, such as dialing a phone number or opening an installed app.
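The mapping from a resource entry to a device action can be sketched with action URIs (on Android, these would be handed to an intent). The resource names and numbers below are placeholders, not real listings:

```python
# Hypothetical sketch: each resource carries the data needed to build an
# action URI ("tel:" dials a number, "https:" opens a page).
resources = [
    {"name": "Shelter Hotline", "category": "shelter", "phone": "555-0100"},
    {"name": "Legal Aid Directory", "category": "legal",
     "url": "https://example.org/legal"},
]

def action_uri(resource):
    """Pick the appropriate action: dial if the resource has a phone, else open its URL."""
    if "phone" in resource:
        return "tel:" + resource["phone"]
    return resource["url"]

print(action_uri(resources[0]))  # tel:555-0100
print(action_uri(resources[1]))  # https://example.org/legal
```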

With a prototype of the Android app built, the team is now looking at the problem of gathering lists of homelessness and social service resources. To that end, the team is planning to crowd-source phone numbers, URLs, and apps that can connect homeless individuals with resources they want or need.

PyLadies Web Scraping

By combining data from the Meetup API with the power of Python web scraping, the new PyLadies Twin Cities group set out to track, analyze, and visualize the representation of women in Twin Cities technology meetups. Ultimately, they want more women to be part of the Twin Cities technology community.

To begin with, the PyLadies have established a set of metrics to use when analyzing the populations, attendees, and speakers of Twin Cities technology groups. The team also spent the weekend learning how to scrape webpages using Python and Beautiful Soup.
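Scraping with Beautiful Soup boils down to parsing HTML and selecting the elements you care about. Here is a minimal sketch against an inline HTML snippet; the markup is invented and does not reflect any real meetup page:

```python
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

# Invented HTML standing in for a fetched attendee-list page.
html = """
<ul class="attendees">
  <li class="member">Ada</li>
  <li class="member">Grace</li>
  <li class="member">Alan</li>
</ul>
"""

soup = BeautifulSoup(html, "html.parser")
names = [li.get_text(strip=True) for li in soup.select("li.member")]
print(names)  # ['Ada', 'Grace', 'Alan']
```

In practice the HTML would come from an HTTP request to the target page rather than an inline string, but the parsing and selection steps are the same.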

Going forward, the team will present its first analysis of a technology group at the July Twin Cities PyMNTos meetup. The team also plans to analyze other Twin Cities technology groups, and to apply text analysis to understand how men and women describe themselves in Twin Cities technology groups.
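A first pass at the planned text analysis could be as simple as word frequencies over member self-descriptions. The sample bios below are invented stand-ins for scraped profile text:

```python
import re
from collections import Counter

# Invented sample bios standing in for scraped member descriptions.
bios = [
    "Python developer and data nerd",
    "Designer learning Python",
    "Data scientist, Python enthusiast",
]

# Lowercase, split on non-letters, and tally word frequencies.
words = Counter(
    w for bio in bios for w in re.findall(r"[a-z]+", bio.lower())
)

print(words.most_common(2))  # [('python', 3), ('data', 2)]
```

Splitting the bios into groups before counting would let the team compare how different groups describe themselves, as the project plans to do.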