Finally, I think I got the seats data for the 2015 general election sorted out!
I worked on it yesterday so tonight (day 23) I need to test it and make sure it’s validating properly.
Also, I had a chance encounter with a friend and had a chat about this little experiment of mine 🙂 as well as a bit broadly about the current political climate (sigh).
I’ve started work today on a script to automatically generate the seats data needed for the election data project. With that done, future elections should be easier to put together.
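Roughly the sort of thing I mean – a quick sketch, where the filenames and the constituency/party/votes columns are placeholders rather than the project’s actual layout: take a results CSV with one row per candidate, pick the winning party in each constituency, and write that out as the seats file.

```python
import csv
from collections import defaultdict

# Placeholder filenames and column names (constituency, party, votes) -
# the real project files may be laid out differently.
RESULTS_CSV = "2015_results.csv"
SEATS_CSV = "2015_seats.csv"

# Group the candidate rows by constituency.
candidates_by_seat = defaultdict(list)
with open(RESULTS_CSV, newline="") as f:
    for row in csv.DictReader(f):
        candidates_by_seat[row["constituency"]].append((row["party"], int(row["votes"])))

# Write one row per seat: the party with the most votes wins it.
with open(SEATS_CSV, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["constituency", "winning_party", "votes"])
    for constituency, candidates in sorted(candidates_by_seat.items()):
        party, votes = max(candidates, key=lambda c: c[1])
        writer.writerow([constituency, party, votes])
```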
Having stayed up all night to watch the election (depressing as it was) I took the opportunity to collate the general election results.
The project gives us an open data source showing the results for each constituency, and the seats. It’s not quite finished, and I plan to add JSON versions alongside the CSV ones.
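The JSON versions should just be a re-serialisation of the same rows – something like this sketch (again, placeholder filenames rather than the real ones):

```python
import csv
import json

# Placeholder filenames - swap in the real results file.
with open("2015_results.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Each CSV row becomes a JSON object with the column names as keys.
with open("2015_results.json", "w") as f:
    json.dump(rows, f, indent=2)
```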
This is partly the reason why I love Lichfield so much. Our council CEO recently joined Twitter and was last night tweeting our (much revered) webmaster, Stuart Harrison:
As part of the Making a difference with data project I was asked to run an ‘unworkshop’ for the West Midlands which pulled together hyperlocal site owners, local government folk and data geeks.
Thanks to Nick Booth’s generosity we had a suitable venue where we could gather. We had 16 people altogether and a good mix of backgrounds. It was also encouraging to see so many journalism students running hyperlocal sites in attendance.
Objectives for the evening were fairly simple: to find out the issues most important to communities, what information pertains to those issues, who holds that information and whether it’s available; if it is, how do we use it, and if not, what exactly do we want?
We started by brainstorming the most important issues to the community and ended up with the four big ones being:
Jobs & benefits
Anti-social behaviour
Budget cuts
Built environment
We split into four teams with each team looking at one of these issues and went away to discuss them and our objectives for the night.
The group looking at jobs & benefits decided on a few key starting points:
Information about available jobs is live and rarely out of date.
There is a difference in quality of information between jobs put out by the private sector (e.g. recruitment agencies) and the public sector (i.e. Job Centre), where agencies typically mask employer details. It’s unlikely that change will be effected here.
Public and private sectors have different agendas – i.e. Government need to encourage employment.
Job centres are policing rather than constructively seeking jobs for people whilst confusing people with a three-layered IT-based system.
With this, these points about necessary change came out of the discussion:
The DirectGov database needs to be more open, instead of being hidden behind the current three-layered, difficult-to-use interface.
Provided as open data, the database could be formatted into more usable applications.
The incentive for this change is the need to increase employment levels and do that better than the private sector.
Obviously a big issue at the moment, the group looking at budget cuts had a wide-ranging discussion. Some key points to come from it were:
Data can be collected in different ways, and how it is collected affects what conclusions and consequences follow from it.
Data is not ideologically or politically neutral
Complications arise with linked data – i.e. as soon as you have one set of data it’s likely you’ll want to explore it, but will need another set of data to do so. This process repeats itself, making a single issue more complex just because of the effort involved in analysing the relevant data (there’s a small sketch of this kind of join after this list).
Data rarely comes with an explanation of what it is, why it was collected and which methods were used.
Information is data with added opinion.
People approach budget cuts with their own opinion and seek the data to confirm that.
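To illustrate the linked-data point above, here’s a rough sketch – the filenames and the area/cut_gbp/population columns are made up purely for the example – of joining a cuts dataset to a population dataset on a shared key, just to express each cut per head:

```python
import csv

# Made-up example files: budget cuts per area and population per area,
# joined on a shared "area" column.
with open("budget_cuts_by_area.csv", newline="") as f:
    cuts = {row["area"]: float(row["cut_gbp"]) for row in csv.DictReader(f)}

with open("population_by_area.csv", newline="") as f:
    population = {row["area"]: int(row["population"]) for row in csv.DictReader(f)}

for area, cut in sorted(cuts.items()):
    if area not in population:  # mismatched keys are where the effort usually goes
        print(f"{area}: no population figure found")
        continue
    print(f"{area}: £{cut / population[area]:,.2f} cut per head")
```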
Food for thought from this group then:
How can we make linking data easier, and communicate that without overcomplicating it?
The 5-step process encouraging ‘just get it out there’ is great, but the data now needs explaining.
It costs money to find out about empty properties – why is this, what is the cost for?
Where’s the cause and effect with empty buildings?
Is a property empty because of planning permission?
Some thoughts on possible solutions/advances:
Planning notices are placed on lampposts – these should be available as open data (i.e. we shouldn’t need scrapers like Planning Alerts).
Mapping planning applications could play a big role in providing information on properties.
Re-purposing of empty buildings should be a consideration – Birmingham City Council is already doing this.
Housing exchanges should be looked at where two council tenants wish to move to another local authority area.
Conclusion
All in all, a good bunch of thoughts, and for me a lot of this boils down to five points that need acting on:
We need more open data – we have been given a lot but there is more out there and open data should be the default.
But we need context – data can often carry an agenda with it, so we need context such as why the data was collected, who collected it and for whom; otherwise how can we trust the data?
Linking data should be easier – the concept of linked data is all very well but there are very few people with the know-how to actually do it.
Data empowers community solutions – issues such as empty buildings and the lack of a home for community groups could be solved if the relevant information were freely available in an open format that could be interrogated.
Training is a must – we have a lot of data, we need more, and we need explanations with it to provide community solutions to community problems, but we need the knowledge to retrieve, link and interrogate data effectively.
A big thank you to everyone who came and gave up their evening. Especially to Nick for providing his office as a venue, to Nicky for shooting the videos and to Michael Grimes for his notes.