Hear Chicago developer and OpenStreetMap U.S. board member Ian Dees talk about this weekend’s Chicago OpenStreetMap edit-a-thon.

“The first thing is … look at this URL. That’s a big one.”

Demond Drummer just pulled up http://www.cityofchicago.org/city/en/depts/dcd/supp_info/large-lot-program.html, the official page for the city of Chicago’s Large Lot program. The program allows residents to buy $1 lots in the greater Englewood area.

Drummer, a tech organizer for Teamwork Englewood, walks through what it takes just to find out whether a lot is eligible for the program: opening a PDF map of the program area, the city’s zoning map and a list of city-owned property.

“The whole point of the program is to lower the barrier of entry to get these lots back on the tax rolls, but the barrier to entry of the project itself is high,” Drummer said.

It’s why Drummer approached DataMade, a Chicago civic technology company, to build LargeLots.org, which makes it easier for residents to find out if they’re eligible for the program and then apply. The pilot program is accepting applications through April 21.

“You could figure it out on your own, but you have to download this spreadsheet from the city data portal with every city-owned lot, compare it to this PDF map and then go to the city zoning map,” said Derek Eder, a co-owner of DataMade. “All the pieces are there, but having to go through the steps is a challenge.”

The Large Lots program was officially announced on March 20. That night Drummer got a text from Juan-Pablo Velez, another veteran of Chicago’s civic data community, asking if he was going to build something to make the process simpler.

The next day Drummer got in touch with Eder. DataMade has a contract with LISC (the Local Initiatives Support Corporation) to build tech solutions for neighborhood groups, and Eder had been bugging Drummer to bring them a project for a while.

On March 25 they met at the weekly Open Gov Hack Night Eder co-hosts at 1871 to start hashing out a plan. Drummer knew he wanted a map showing eligible properties and the steps to apply. Other than that he was open.

“Our understanding of the program changed, but these requirements (the map and steps) never changed,” Drummer said.

They ran the project by LISC, which agreed to fund the site through the contract with DataMade. The initial commit to the LargeLots GitHub repo came at 3:34 p.m. the next day. By the end of the day the site already had a basic map and an address search.

The first challenge was to get all the data in place (they list out everything in the site’s about section).

Eligible lots had to be in the pilot program area, owned by the city, zoned residential and without a building already in place. That meant the team needed a map of the area, a list of city-owned property and zoning information for Chicago.

Luckily, most of the datasets (city-owned property, zoning maps, Cook County parcel maps) were already available. The team also got a break when the city included property index numbers (PINs) in the list of city-owned property, which allowed them to easily join it with county parcel data.

“That was fortunate, because we wanted to show the size of the lots, because the county just released its parcel data,” Eder said. “Most city datasets don’t have that PIN, and there’s no table with addresses and PINs.”
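
Once both sources share that key, the join itself is straightforward. Here’s a minimal sketch of that kind of PIN-based merge in Python with pandas and geopandas; the file names and column names are hypothetical stand-ins, not DataMade’s actual code:

```python
import pandas as pd
import geopandas as gpd

# Hypothetical inputs: a CSV of city-owned lots from the data portal and a
# county parcel shapefile, both carrying a property index number (PIN).
city_lots = pd.read_csv("city_owned_lots.csv", dtype={"pin": str})
parcels = gpd.read_file("cook_county_parcels.shp")

# Normalize the PINs (strip dashes, pad to 14 digits) so the two sources line up.
city_lots["pin"] = city_lots["pin"].str.replace("-", "", regex=False).str.zfill(14)
parcels["pin"] = parcels["pin"].astype(str).str.replace("-", "", regex=False).str.zfill(14)

# Attach parcel geometry (and with it the lot size) to each city-owned lot.
city_parcels = parcels.merge(city_lots, on="pin", how="inner")
city_parcels["sq_ft"] = city_parcels.geometry.area  # assumes a foot-based projection
```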

The map of the Large Lot area only existed as a PDF, though, so Eder made a shapefile of the boundaries using GeoJSON.io, drawing them by hand.

“I basically eyeballed it and had Demond check that it was right,” he said. “We loaded it into PostGIS (a geographic database) and did a geo merge to see what parcels were in that area.”
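
That “geo merge” is a spatial join: keep only the parcels whose geometry falls inside the hand-drawn boundary. The sketch below shows the same idea in Python with geopandas rather than PostGIS; the file names are again hypothetical:

```python
import geopandas as gpd

# Hypothetical inputs: the hand-drawn pilot-area boundary exported from
# GeoJSON.io and the city-owned parcels produced by the PIN merge above.
boundary = gpd.read_file("large_lot_area.geojson")
city_parcels = gpd.read_file("city_owned_parcels.geojson")

# Put both layers in the same coordinate system before joining.
city_parcels = city_parcels.to_crs(boundary.crs)

# Keep only the parcels that fall within the pilot-area boundary.
eligible = gpd.sjoin(city_parcels, boundary, how="inner", predicate="within")
print(len(eligible), "city-owned parcels fall inside the pilot area")
```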

With a working site together, Drummer sent the demo version to Jeanne Chandler with the city of Chicago Department of Planning.

“She said, ‘Why is the map so small?’” Drummer said.

The team had mistakenly used a map of what they thought was the pilot area, which turned out to be only extra markup on the city’s map.

GitHub was the main communication tool for the project in addition to the code repository, so Drummer opened an issue for the map change on April 2, one week after work had started on the site.

“This was a big moment in the project,” Drummer said.

With the larger map the team switched some of the mapping tools it was using, settling on Leaflet to display the base map and CartoDB to display the buildings.
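
As a rough illustration of that division of labor (a Leaflet base map with a lot layer drawn on top), here’s a short sketch using folium, a Python wrapper around Leaflet. The real site serves its lot layer from CartoDB and is written in JavaScript; the file name, field names and map center below are illustrative only:

```python
import folium

# Leaflet (via folium) draws the base map; the lot data sits on top of it.
# On LargeLots.org that layer comes from CartoDB; here a local GeoJSON file
# of eligible parcels stands in for it.
m = folium.Map(location=[41.78, -87.66], zoom_start=13)  # roughly the pilot area

folium.GeoJson(
    "eligible_parcels.geojson",
    name="Eligible lots",
    tooltip=folium.GeoJsonTooltip(fields=["pin", "address"]),  # hypothetical fields
).add_to(m)

m.save("large_lots_map.html")
```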

After that the main issues were keeping up with the city’s rules for the program and explaining them clearly.

“It grew with time, but the iterations came as we knew more about the application process,” Drummer said. “It was an iterative process for the city.”

LargeLots.org officially launched April 4, two weeks from the program’s announcement and 10 days after work started on the site.

Eder said the site’s gotten around 5,000 hits, most coming after it was featured on ABC 7 news.

“That was one of the coolest things because they showed the app first,” he said. “’Look at how easy it is!’”

So far most questions have been about how to go through the application process, not the site itself, which was the goal from the beginning.

“Everyone is more asking about policy questions, which is the kind of interaction we should have,” Eder said. “The focus should be on the policy, not the website.”

While the site has a short shelf life, Drummer said the project was worth taking on, especially if the program is expanded to other areas. The code for the project is open source, so anyone wanting to implement a similar site for a different problem could take it and run with it, Eder said.

One of the key successes for both Drummer and Eder was the speed and ease of starting the project through the agreement with LISC. Instead of a community group pitching a project to civic hackers to do in their spare time, Drummer could hire DataMade.

“We need this solution, and we have the resources to retain your time,” Drummer said. “That’s a big step.”

Eder sees it as a way of making sure civic hackers go where there’s need and not just invest in projects they happen to know about.

“There’s been some criticism of open government that it’s not available in areas that don’t already have the technology,” Eder said. “We’re in a position now as a community to bring these two worlds together and help with these issues they need help with.

“It’s the same principles, but it’s reaching an audience that needs it,” he said. “This was influenced so much more by real need, and it’s so much more powerful because of it.”

A truck drives over a stretch of South Ashland Ave. (Chris Hagan / WBEZ)

The snow has mostly melted, temperatures are slowly warming and Tom Carney is hoping his crews can stop filling so many potholes.

Carney, deputy commissioner for the Chicago Department of Transportation’s Division of In-House Construction, has overseen the most active pothole season in recent years. He’s looking forward to moving from potholes to construction soon.

“Ideally the weather will keep warming up, we’ll continue to work down the higher number of potholes in the system,” Carney said. “We’ll be transitioning to start construction season work, and some will be milled and repaved.”

April is typically when pothole complaints and pothole filling start to decline, though Carney isn’t making any predictions after Chicago’s third-coldest and snowiest winter on record.

The city attempted to get ahead of the problem, deploying ‘strike teams’ earlier than normal. Carney said his crews have also focused on arterial streets every Friday and Monday.

The result has been a lot of work for Carney’s crews, though he’s enjoyed pulling the work reports every morning.

“One benefit of it, our crews’ production has gone through the roof,” he said.

Crews filled more than 145,000 potholes in March, according to 311 reports on the city data portal. It’s the highest figure since 2011, the earliest year available on the portal.

No street took more abuse than South Ashland Avenue.

While the 13-mile stretch has led Chicago streets in potholes filled the last two winters, this year problems have exploded. Crews filled more than 25,000 potholes on South Ashland from October to March, four times more than any other street.
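
Those figures come straight from the 311 pothole-patching records on the city’s data portal. As a rough sketch of how a count like that could be reproduced from a downloaded export of the dataset (the file name and column names here are hypothetical, not the portal’s exact schema):

```python
import pandas as pd

# Hypothetical export of the 311 pothole-patching records.
reports = pd.read_csv("potholes_patched.csv", parse_dates=["completion_date"])

# Potholes filled in March 2014.
march = reports[(reports["completion_date"] >= "2014-03-01") &
                (reports["completion_date"] < "2014-04-01")]
print("Filled in March:", march["potholes_filled"].sum())

# Streets with the most potholes filled from October through March.
winter = reports[(reports["completion_date"] >= "2013-10-01") &
                 (reports["completion_date"] < "2014-04-01")]
by_street = winter.groupby("street_name")["potholes_filled"].sum()
print(by_street.sort_values(ascending=False).head(10))
```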

When asked why South Ashland, Carney had a simple answer.

“Have you ever driven down South Ashland?” he asked. “It’s extremely bad.”

Still, extreme events are always more complicated than just a bad stretch of pavement.

The worst stretch was between 40th and 50th streets, which alone had 10,000 potholes filled. Carney said that overpass work at 39th Street pushed traffic down to a single lane much of the winter, concentrating it on certain spots in the road.

A corner on South Ashland Ave. breaking down after a historically bad winter. (Chris Hagan / WBEZ)

Combined with the abnormally cold and wet winter, the extra traffic helped break down the street more often than usual.

“When you start that far in disrepair and add in the brutal winter we’re having, it’s a perfect storm on Ashland,” Carney said. “There’s a lot of truck traffic, a lot of bus traffic. That street has taken a pounding.”

See where the most potholes have been filled this winter

For the past month Josh Kalov has been digging through Cook County’s public data, cataloguing what’s available and what maybe should be.

It’s something Kalov has a lot of experience with. He was part of the team that built SchoolCuts, and is a veteran of Chicago’s open government community.

The only difference is that now he’s doing it from the inside, as a consultant with the county through a contract with the Smart Chicago Collaborative. It’s a big personal move, as Kalov leaves a job at NAVTEQ to start his own consulting firm, Kalov Strategies.

“It’s definitely something I’ve been looking-but-not-looking-for,” said Kalov. “I was happy in my job, but working on SchoolCuts on the side. Having this opportunity helped me be able to actually quit the full-time salary job.”

Cook County and Smart Chicago formed a partnership in January, building off the county’s 2011 Open Government ordinance (Disclosure: Chicago Public Media is also a partner with Smart Chicago). Kalov started working with Smart Chicago at the beginning of March.

Only one month in, Kalov’s role is still being defined. Most of his work has been identifying what the county has made available on its data portal.

“Right now I’m doing like a data census, just looking through what’s been released and the problems with it, the cleanup stuff,” he said. “Things that are missing attribution or were posted three years ago and not updated.”

Eventually the plan is for Kalov to advise the county on new datasets to release and help automate updates. For now he’s keeping an eye on news stories and reports from organizations to see what data people are interested in and may want more of.

“There’s a lot of stuff people have wanted, like the parcel data that was released recently,” Kalov said. “There’s a lot of stuff where I don’t know who’s the owner of the data yet. It’s the same thing everyone else is doing, but with the power of the President’s office behind it.”

Before NAVTEQ, Kalov also spent four years working in the GIS department at Kankakee County, giving him experience inside and outside government as he starts work on his new role.

“It’s interesting perspective, especially right now since this is kind of a hybrid perspective,” he said. “The larger view of the datasets themselves and the scripts and automation, but also thinking about it from the individual use case, so I think that helps.”

It also means he knows the difficulties of working in the county structure and what it’ll take to push the partnership forward.

“In a county you have multiple elected officials so there’s no one person who has power to bring all those departments together,” he said. “So the president can’t demand that the assessor participates in something or the clerk participates in something. Elected officials will say they only report to the voters, so there’s no guarantee that you’ll get that coordination, where in the city that’s not the case.”

One of Kalov’s goals is to increase cooperation when it comes to releasing data, including reaching out to governments within Cook County other than Chicago.

“Getting started with our collaboration with the city, I’d like to hopefully expand that to other organizations and municipalities,” Kalov said. “I have no idea the feasibility, but looking at if there’s a way for other cities to load their data to the county, especially the smaller ones that can’t afford Socrata.”

Divvy announces winners of its first Data Challenge.

President Obama wants more people thinking about climate change and sees data as a key part of making the general public more aware of the issue.

This week the White House launched a website to make climate data more accessible. It currently has datasets relating to coastal flooding and sea level rise from NOAA, NASA, the U.S. Geological Survey, the Department of Defense and other agencies.

"Unlike most other datasets, this has always been available," said Rao Kotamarthi, an atmospheric scientist with Argonne National Laboratory, speaking on Afternoon Shift Thursday.

"This brings these together and provides maybe some modeling tools to make it more available, maybe not for people in the scientific community but the user side," he said. "It’s a gesture from the White House to put some attention on the issue."

Kotamarthi said he hopes the new data will attract partnerships with private companies, the same way weather forecasters use public data from the National Oceanic and Atmospheric Administration (NOAA).

"NOAA does all the weather forecasting, but the people who present it to you on a daily basis are the Weather Channel, TV stations,” he said. Maybe there’s a better way to present the risks [of climate change] than the scientific community has come up with.”

While most of the data has been around, there are a few new bits hidden in the 83 datasets released this week.

"One interesting dataset not commonly available I see on here is critical infrastructure that has been declassified,” Kotamarthi said. “Maybe you can come up with a visualization that will show what would happen to that with sea level rise.”

He sees the site and partnerships as possibly changing policy down the road, mostly by making the issue easier for the general public to understand.

"It can influence the policy in the sense that once everyone can get a clear visualization and not so abstract it can help you form a realistic view of what will happen in the future," Kotamarthi said. "I see the reports and you can too: they’re hundreds of pages. To distill them into a simple visualization, there must be someone out there who is much more clever you can present that in a better way.”

The Divvy Data Challenge officially closed March 11, and according to Divvy it received 99 entries.

While the winners won’t be announced for a while, some of the entrants have started posting their work online. Below are some of the early standouts to get a taste of what might be coming.

Thanks to Chicago Data Viz and the #DivvyData hashtag on Twitter for making these easy to find.

This first one comes from Mike Freeman, and allows you to look at the total number of trips taken by day, hour and gender. You can then drill down by selecting specific variables to see how the data changes.

Datascope’s entry uses a Voronoi heat map to track how riders move between stations. Find out where riders from Streeter Drive end up and how many tourists bike to Cubs games.

Drew Priest put together a really interesting project that shares insights but also hypotheses about what the stats may mean. He also has a great chart comparing rides to daily weather data.

Don Drake put together a visualization with a series of charts and a map documenting all the trips, while also allowing you to filter by certain aspects.

Obviously this is just a small part of what will eventually come out of the project, but it’s interesting to see the connections people are already making.

wbez:

So far this season Chicago has experienced one Dennis Rodman of snow. It certainly feels like it.