T shirts – You Decide

If you don’t know that Open Data Manchester has been involved in the development of the Manchester Hackathon, where have you been? There will be over sixty coders, creatives, journalists and activists all getting down and dirty with Manchester’s open data. To mark the occasion Open Data Manchester is going to have some T-shirts made. Would you like one?

Below are some designs, help us choose by entering 1, 2, 3, 4 or 5 in the reply box at the bottom of the page.

 

1.

2.

3.

4.

5.

The Economics of Open Data

Data doesn’t make for a very good tradable commodity. Its benefits spread well beyond the people who trade in it, it’s almost impossible to stop people from copying and sharing it, and it can be enjoyed by multiple people at the same time.

In a post written for Open Data Manchester on The Economics of Open Data, regular member Robin Gower explains how these characteristics mean that data will have a much greater economic and social impact if it is made available as open data. He also discusses the implications for established “closed-data” business models and for the government.

The Manchester Hackathon – 17th November

Calling all hackers, coders and creative collaborators – Manchester needs you to shape the future of the digital city.

For the first time ever, the City of Manchester invites you to dig underneath its digital skin. FutureEverything, Open Data Manchester and Manchester City Council are looking for experts and innovators to hack, code, programme and experiment with the city’s sets of open data to build new applications and develop future services.

Utilising the open data sets from DataGM made available by Manchester City Council and public sector partners, participants are welcome to produce anything they wish – develop applications to help people find their way around, stay safe, discover new experiences and everything and anything in between. All data is released under the Open Government Licence.

Taking place at MadLab in the heart of Manchester’s Northern Quarter on Saturday 17th November, the Manchester Hackathon is set to be an intense, productive and exciting collaboration between the brightest minds in software development and data processing. Entries from both teams and individuals are welcome, and there are cash prizes to be won for the best product at the end of the session, including:

    • Grand Prize – £4,600*
    • Best Under 21’s Creation – £600
    • Best Visualisation – £600
    • Best Locative Application – £600
    • Developer’s Prize – £600
    • Best Solution for an Identified Problem – £600

 

* £1,000 prize & £3,600 development funding

The event is completely free to enter and open to all. Register HERE

The prizes will be selected by a panel of independent industry experts, including Dave Carter (MDDA) and Lou Cordwell (magneticNorth).

The Hackathon takes place on Saturday 17th November, 9am – 7pm, with a warm-up and networking session beforehand at MDDA (Lower Ground Floor, 117-119 Portland St, Manchester, M1 6ED) on Friday 16th November, 6.30 – 8.30pm.

 

The Manchester Hackathon is partially funded under the ICT Policy Support Programme (ICT PSP) as part of the Competitiveness and Innovation Framework Programme by the European Community.

ODM – September 2012 Edition

6.30pm – 8.30pm, Tuesday 25th September
Venue: MDDA, 117-119 Portland Street, Manchester M1 6ED

Sign up on Eventbrite

After a brief summer hiatus Open Data Manchester is back and temporarily at a different venue.
The last event saw James Cattell from Digital Birmingham, Andrew MacKenzie from the UK Government’s Open Data User Group and Jag Goraya from the GIST Foundation in Sheffield talking about how open data initiatives were developing in Birmingham and Sheffield, and about Birmingham City Council’s adoption of a corporate open data strategy.

Since the last meeting there has been quite a bit of activity, mostly around some forthcoming hackdays and support for open data initiatives in Manchester. Last Tuesday saw the launch of Tech Hub Manchester in Carver’s Warehouse on Dale Street, Manchester. This is going to be a new co-working space networked into Tech Hub London and a wider international digital start-up community, and the Tech Hub team will be coming to talk about the initiative and about Start-up Weekend, a two-day hack event utilising open data.

The City of Manchester is also looking at developing open data as part of a new Technology Strategy Board – Future Cities Demonstrator project. This is a large £24 million fund that will help the creation of digital services within the city. Anne Dornan, who is working on the project, will explain how open data fits into this.

If you are interested in public transport, and a lot of people are, Move*Manchester is an Innovation Challenge that will be running in March 2013. The planning is being finalised, but it will entail a weekend event based around a hackathon that will lead to product development and support. The prize fund and support package to develop products and services will be approximately £35,000 and is part of the CitySDK programme run by FutureEverything and Manchester City Council with the support of Open Data Manchester. More details to follow.

We will also be looking at the latest data releases on DataGM and from TfGM, cool developments and anything else people want to show.

Open Data and the Personalisation of Experience

Earlier in July at SMC_MCR, a monthly digital and social technology meet-up in Manchester UK, BBC R&D demonstrated a new approach to personalised entertainment called Perceptive Media. It is something that BBC producer Ian Forrester had been talking about for some time, having first revealed it at SMC_MCR in February. At that point it was hard to understand what the concept entailed. It was explained as a way of delivering media tailored to individual preference and environment, but little else.

 

On their return in July the team showcased a short radio play demonstrating some of the concepts of Perceptive Media. The play can be found at http://futurebroadcasts.com/. At first listen the play seems to follow the traditional radio play form, but within it there are certain personalisations based upon the location of the listener. After a couple of listens it is quite obvious where the personalisations exist. As Ian Forrester stated in the Q&A, it was a fairly basic demonstration of the technology, pointing to the challenges of narrative personalisation and the ability to create these personalisations ‘on the fly’, in the browser. Even with such a short and somewhat basic demonstration of Perceptive Media it is easy to see how it could develop into a more complex form cutting across platforms.

 

The personalisation aspect of Perceptive Media comes from the creation of a narrative framework that allows certain variables to be inserted, with these variables influenced by the data that the Perceptive Media storyteller has access to. In the case of ‘Breaking Out’ – the play in the demonstration – the data accessed was local weather, listings and local news. As more data is made available it is easy to see how it could be integrated into a Perceptive Media framework. The demonstration offers a glimpse into a new form of storytelling based on an individual’s location and environment and, if coupled with personal data, their preferences and situation.
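A crude way to picture that narrative framework is a template with slots filled from data feeds. The sketch below is purely illustrative Python: the BBC’s actual system assembled audio variants in the browser, and all of the field names and data here are invented.

```python
# Toy illustration of the Perceptive Media idea: a fixed narrative
# frame with slots filled from data about the listener's locality.
# Every name and value below is invented for illustration only.

NARRATIVE = (
    "She looked out of the window. {weather} over {place}, "
    "as usual. On the radio, someone mentioned {local_news}."
)

def personalise(frame: str, listener_data: dict) -> str:
    """Fill the narrative frame's slots from per-listener data."""
    return frame.format(**listener_data)

# Hypothetical data a storyteller might pull from weather, listings
# and local news feeds for a Manchester listener.
manchester_listener = {
    "weather": "Rain",
    "place": "the Northern Quarter",
    "local_news": "roadworks on Oldham Street",
}

print(personalise(NARRATIVE, manchester_listener))
```

The interesting design problem, as Forrester noted, is not the substitution itself but keeping the narrative coherent whichever values land in the slots.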

 

In 2009 at FutureEverything there was a presentation by Philip Trippenbach, then at the BBC, about the construction of narrative in games, especially first-person games. He highlighted a game called ‘Six Days in Fallujah’, which he writes about at http://trippenbach.com/2009/06/09/six-days-in-fallujah-and-the-dirty-g-word/. What I find of interest is the possible use of the form to educate, to disseminate news and information in a way that many would be uncomfortable with. What Trippenbach talks about is a personalisation of experience, a certain ‘being there’. The use of real situations to create realistic gaming experiences is not new, but a concerted attempt to create ultra-realistic gaming through streaming of real-time positional and telemetry data from Formula 1 Grand Prix was attempted in 2006: http://news.bbc.co.uk/1/hi/technology/7440658.stm. Although, as the article states, it would probably only be of interest to hardcore gamers, it offers fascinating possibilities about what could be achieved at this intersection of gaming, personalisation and data.

 

Although not using open data, a great example of this was demonstrated at FutureEverything in 2011. Arcade Fire’s ‘We Used To Wait’ scores a personalised film called The Wilderness Downtown, created by Chris Milk in association with Google Labs. It invites the user to enter the address of where they grew up, and then the HTML5-based experience literally flies. You can try it here: http://www.thewildernessdowntown.com/

 

Data, both open and personal, is at the centre of the personalised experience, whether it be local weather, what food we like, the position of racing cars, the location of where we once lived or the environment in which real-life situations were played out. We are starting to see a new world where the way information is delivered to us is adaptive, often in real time and just for us. It might not be to everyone’s liking but it is happening – just look at what Google are doing: http://www.google.com/landing/now/

 

Disclosure: Julian Tait is a co-founder of SMC_MCR and content programmer for FutureEverything

Open Data Manchester March meeting

March’s meeting was an opportunity to help shape Manchester City Council’s forthcoming open data Hackathon. Stuart Baldwin – an ODM regular – spoke about Manchester’s plans for an event in October to coincide with the Manchester Science Festival.

The driver behind this is the recently announced Manchester Digital Strategy and a recent trip that Chief Executive of MCC, Sir Howard Bernstein made to New York. Whilst a guest of Mayor Bloomberg, Sir Howard was apparently impressed with what New York was doing with their open data initiatives such as 311 and App Challenges.

Open Data Manchester and MDDA advised MCC that, for a Hackathon to work, it needed to engage the developer community to make the event relevant and developer friendly.

The conversation mainly focussed on the types of data that developers wanted released, and there is a list from Duncan Hull @dullhunk here

What was notable was the willingness to listen to what the community wanted, and the suggestions from MCC itself, such as Contaminated Land data, which has traditionally been contentious.

[vimeo http://www.vimeo.com/36540620 w=400&h=300]
Visualisation by Jonathan Fisher more details here

After the Hackathon discussion, attention focussed on Road Traffic Collision data and the work that Steven Flower, Jonathan Fisher and Jonathan S. have been doing. There has been discussion about forming a sub-group around RTC data and its use, so if people want to get involved in that, contact Steven Flower on the Google Group. Jonathan Fisher’s visualisations were discussed, as was the variation in data quality that exists. It was noted that although data is provided to TfGM, who collate it for the Department for Transport, different flavours of the data exist in different places. TfGM upload monthly data to DataGM which lacks detail on casualties and vehicles involved. The complete RTC data gets forwarded to the DfT, who then make it available via the DfT website and data.gov.uk with more detail, but in two different versions. We are trying to find out why DataGM only holds the less detailed version.

January meeting with TfGM

January’s Open Data Manchester was a transport special, with Craig Berry and Dave Busby from TfGM giving an update as to the types of data that TfGM hold, and what they are trying to release. Open Data Manchester people may already know of Craig Berry as the Information Manager who has been tasked with identifying and releasing open data. Dave Busby’s brief is for integrated ticketing and real-time information.

TfGM reinforced its position with regard to open data at the meeting. There have been a number of rumours over the past twelve months as to what the organisation was trying to release to DataGM – Greater Manchester’s open data portal. TfGM are currently releasing data on bus schedules, NaPTAN stop locations, fixed and mobile speed camera locations and monthly Road Traffic Collision updates. It had also been mooted that some real-time data would be released.

Greater Manchester has been crying out for an intelligent integrated ticketing system. To many, the lack of such a system has made travel by public transport around Greater Manchester more difficult than it should be. To this end TfGM are developing a specification that will go to tender in the first half of 2012. The system will initially cover Metrolink and then encompass Greater Manchester buses. It will use contactless technologies in a similar vein to TfL’s Oyster card, but with the added functionality of being able to use contactless bank cards and NFC phones. It was interesting to note the certainty that NFC will be adopted by most handset companies within the next year. Paying by Google Wallet was also mentioned as a possibility. The ticketing system will also have fare rules that will calculate the best price for journeys undertaken.
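Those “best price” fare rules resemble the daily capping that Oyster pay-as-you-go uses: charge each single fare, but never more than the day-ticket price in total. A minimal sketch, with invented fares rather than any actual TfGM tariff:

```python
# Hypothetical "best price" fare rule: sum the single fares for the
# day, but cap the total at the day-ticket price (as Oyster does).
# Both fares and cap are invented figures, not a real tariff.

DAY_CAP = 4.60  # invented day-ticket price in pounds

def daily_charge(single_fares: list, cap: float = DAY_CAP) -> float:
    """Charge the sum of the day's single fares, never more than the cap."""
    return round(min(sum(single_fares), cap), 2)

print(daily_charge([1.80, 1.80]))        # two trips: under the cap, pay the sum
print(daily_charge([1.80, 1.80, 1.80]))  # three trips: sum exceeds cap, pay the cap
```

The real engineering difficulty is not this arithmetic but applying it across 40+ operators with different fares, which is where the political will mentioned below comes in.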

Although getting integrated ticketing to work with Metrolink would be a relatively easy task and a useful test bed to prove the utility of the system, getting Greater Manchester’s 40+ independent commercial bus operators to adopt it may be more challenging and may need a certain amount of political will. Anonymised journey data from the system, and personal access to journey history, weren’t discussed in detail. Although the latter seems to be fairly standard in smart ticketing systems, access to anonymised data could offer huge potential for applications and services that look at gate loading on routes, passenger density and so on.

The advent of the oft-mooted real-time data from TfGM looks closer, although no specific timescale was mentioned. There will be access to the Metrolink Passenger Information Display data, although how this will manifest itself is uncertain. Developers present at the meeting suggested that JSON would be preferable. The main challenge with accessing real-time Metrolink location data is that the Tram Management System currently being implemented isn’t yet functioning throughout the network. The initial release of data will cover the South Manchester and Eccles lines.
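No feed existed at the time of writing, so the payload shape and field names below are pure guesswork, but a JSON departures feed of the kind developers asked for might look something like this, and be just as easy to consume:

```python
import json

# Entirely hypothetical PID departures payload for a single stop.
# The stop name, field names and wait times are invented; a real
# TfGM feed could look quite different.
sample_feed = """
{
  "stop": "St Peter's Square",
  "departures": [
    {"destination": "Eccles",     "wait_minutes": 3, "status": "Due"},
    {"destination": "Altrincham", "wait_minutes": 7, "status": "Due"}
  ]
}
"""

data = json.loads(sample_feed)
for dep in data["departures"]:
    print(f'{data["stop"]}: {dep["destination"]} in {dep["wait_minutes"]} min')
```

The appeal of JSON here is exactly this: a few lines in any mainstream language turn the feed into departure boards, alerts or journey planners.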

Although it doesn’t look like there will be any real-time bus data soon, TfGM would like to release the location information of the free Centreline buses that are operated on TfGM’s behalf. This will be location data that won’t identify the actual service the bus is running, though it was suggested that, as there are only three distinct Centreline routes, it wouldn’t be that complicated to identify the service, even where the routes overlap. There is also an Informed Personal Traveller pilot being run in Bury by Logica, ACIS and First Bus. It uses a number of technologies, including an AVL system that has been fitted to approximately 100 of their buses. The IPT application hasn’t been released yet and there are indications that the system is closed.

TfGM recently submitted a bid to the Local Sustainable Transport Fund, and written into it is the provision of open data and the development of an intelligent multi-modal journey planner pulling in all relevant data that TfGM has at its disposal. How developers could access the journey planner was discussed, and whether it would exclude the provision of other types of journey data.

There is a move to make other data available through the LSTF, including car park updates, real-time disruption data, data on journeys along roads and feeds off TfGM’s SCOOT adaptive traffic control system. SCOOT controls half of the approximately 2,000 traffic control signals in Greater Manchester.

The lack of transparency with regard to bus fare structures within Greater Manchester is a subject that has come up many times, especially regarding anecdotal evidence that transport-dependent communities are charged more per mile than others with viable transport alternatives. TfGM stated that Greater Manchester is one of the few places where bus travel is generally more expensive than rail. To this end TfGM are interested in developing a project similar to one that Open Data Manchester was developing over a year ago, which encouraged travellers to submit the details of their journey and its price.

At the close of the discussion TfGM were encouraged to use the Open Data Manchester Google Group as a resource to ask questions and to highlight initiatives and challenges.

Making Open Data Real consultation results published.

The results of last year’s Making Open Data Real consultation have been published. Open Data Manchester submitted a response, as did 246 others.

These responses will be used to define the government’s approach to Open Data and will hopefully bring about a meaningful push from both central and local government.

The following Greater Manchester based organisations responded:

  • North West e Government Group
  • Open Data Manchester
  • Rochdale Council
  • Swirrl IT Ltd
  • Trafford Council
  • Transport for Greater Manchester
Fair play to the above for engaging.

    Summaries of the consultation can be read here:

    Full responses can be downloaded here:

    Open Data Manchester – November Meeting

    November’s Open Data Manchester.

    Paul Gallagher, Head of Online for the Manchester Evening News, gave a presentation regarding the role of the MEN during the Manchester riots. He described how the Manchester Evening News had used social media during the riots and how his team had started to collect data regarding the riots and the subsequent court cases to give insight into some of their possible causes.

    Most interesting were the resources that the MEN had put into reporting on the court cases following the riots. By having court reporters sitting in on each of the trials, they created a schema and dataset showing the areas that people lived in, mitigating circumstances, age, type of offence, sentence and so on. This is data that can only be created if you attend the trial. This allowed them to map offences against deprivation indices and changes in the way that sentencing was delivered over the course of the trials.

    The discussion also touched on news organisations becoming huge archives of sentencing data and how this can affect people’s lives even after their convictions have been spent. The MEN does have a policy where certain details are redacted from the historical archive, but this is done on a case-by-case basis.

    There was also an update on the preparations for the International Open Data Hackday and on the responses to the Government’s Open Data and Public Data Corporation consultations.

    Making Open Data Real – Response from ODM and Open Data Cities programme.

    27th October 2011

    The benefits of adopting open data for the purposes of transparency and accountability have been well documented, but open data is not just about transparency and accountability. We live in a modern technologised society and we need to give people the tools to navigate through our modern data driven environment, whether it be access to transit data, gritting routes or ‘infrastructural’ data such as mapping, hydrology or weather.

    We strongly argue for an open-by-default position, with exemption being justified only on grounds of security or privacy. This is key, as it is virtually impossible to predict what the utility of every dataset will be. It is obvious that certain ‘high value’ datasets (those perceived to improve ‘quality of life’ decisions) will be adopted and used relatively quickly, but some will seldom be used and many not at all. This doesn’t discount their value, as data has to be seen in the broader context of knowledge, and future conditions may make certain datasets more relevant.

    It is also important that any body that delivers services on behalf of the public is required to be open. For example, Manchester is straitjacketed by a fragmented public transport system that has 40+ bus operators all supposedly in competition. Crossing the city may take multiple tickets from multiple operators. There is no motivation for operators to release information about their fare structures, although it has long been identified that a transparent fare structure enables people to budget, plan and use public transport with confidence. At the moment you can only find out a fare by stepping onto the bus or ringing the operator directly. Although some bus operators do see the value of opening up this information, in meetings certain operators have raised concerns about wholesale release of data: that it would allow other operators to undercut their prices (which is the idea of a deregulated system) and local councillors to see how much they charge (a concern which goes against the idea of delivering public service and being accountable).

    There is a case for Land Registry property data to be made available. Speaking to Local Authority colleagues, there is an issue regarding the tackling of housing benefit fraud, where claimants might have property in another borough, and the potential for combating certain money-laundering activities. It might also have effectively tackled the abuse of second home allowances by MPs before it became a major issue.

    We need to encourage a transition to a more intelligent and aware data policy. This cannot be done in one fell swoop but needs to inform procurement, so that when IT systems are upgraded, the ability to express data openly from a system would be specified. The adoption of common data release schedules is to be encouraged, especially where you have metropolitan counties such as Greater Manchester. Our colleagues at Trafford MBC, with whom we partnered in developing DataGM, identified this as an important way to get cross-authority collaboration on dataset release.

    There is a very important benefit to having common data release schedules. At present it is very difficult for developers and digital businesses to take certain open data based applications beyond proof of concept, because the market for open data applications and services is nascent. Common schedules allow the development of products that can quickly find a critical market mass, which in turn validates the demand-side argument for data.

    The public sector is logically the biggest user of its own data, but data that is closed and siloed is often dumb data. We hear countless examples of dumb data policy: local authority officials who can’t find the data they require, creating an environment for ad hoc duplication and divergent standards (in Greater Manchester this is estimated to cost many millions of pounds in lost personnel hours), and local authorities operating multiple GIS systems – up to 30 in some – all with their own licensing agreements and interoperability issues.

    There has to be an adoption of common standards, and these have to be non-proprietary, open and extensible. Although there is a certain resistance to the adoption of Linked Data, mostly due to people not fully understanding the concept and the need for it, with the explosion of data-enabled devices the need for computers to interpret complex data environments is becoming more important. Government has to be a major player in this space, and it also has to be intelligent in how it ensures compliance. Open and extensible formats offer a certain amount of future-proofing over proprietary formats.

    A concern that we hold, especially in light of participating in the EU smart city programme, is that within the UK there doesn’t seem to be much appreciation that open data is an enabler of Smart City and other technologies. Common technological frameworks that allow the development of city-based services across territories are being developed, building larger potential markets for products. What might be unviable in one territory might be viable at scale.
    Future technological developments such as the Internet of Things might be hampered if there is pressure to license and charge for certain ‘infrastructure’ datasets. Certain IoT devices have to be aware of where they are and how they are functioning in relation to public infrastructure and data.

    We strongly feel that we are coming to a point where we see a transition to Government as a platform. This will enable development of services from both within the public sector and outside. Open Data could be seen as evidence of a healthy functioning platform based structure, where the boundaries and interactions between citizen, government and business are porous, diffuse and bidirectional.

    Access to information is key to re-enfranchisement. Open Data has the potential to create a more equitable environment for participation. Although it would be naive to believe that opening up data will automatically create a data-aware citizenry, it only needs a few people with the skills to mediate information in their communities to raise awareness and participation.

    We believe that for Open Data to become sustainable we need to encourage not only the supply side but the demand side for data as well. Where market failure occurs, or where a sector is nascent, there is a need to stimulate activity to drive awareness, create services and applications and develop a base layer from which further development can be derived. Innovation challenges and focused development days are two of the things that can help drive this. There needs to be support for initiatives such as Open Data Manchester, Open Data Sheffield, Open Data Brighton and now Open Data Hull. Often, as in the case of Open Data Manchester and the Open Data Cities project from which it was derived, there is no resource support from the public sector, and this is unsustainable.

    Julian Tait
    Open Data Manchester/Open Data Cities

    Online Response

    1. Do the definitions of the terms go far enough or too far

    Engaged citizens need to have access to the structure of our cities. This isn’t just about league tables, but about allowing people to move seamlessly through their modern data-driven environment.

    There needs to be an additional category of open data that focusses on the open data that enables people to navigate through the modern data driven environment, whether it be access to transit data, gritting routes or ‘infrastructural’ data such as mapping, hydrology or weather.

    2. Where a decision is being taken about whether to make a dataset open, what tests should be applied

    Whether the dataset or ‘datastream’ is being produced to enable the delivery of public services, or, as in the case of transportation data, whether the data is produced for the purposes of disseminating information to the public, enabling them to access services more efficiently – e.g. a Transport Executive producing real-time bus data that enables people to use mobile devices to access the service, saving the capital outlay of investing in real-time bus signage.

    3. If the costs to publish or release data are not judged to represent value for money, to what extent should the requester be required to pay for public services data and under what circumstances?

    The terms for value for money can be vague and encourage abuse. A test should be whether the data holder is creating the data for the delivery of their own service rather than explicitly for the request.

    4. How do we get the right balance in relation to the range of organisations (providers of public services) our policy proposals apply to? What threshold would be appropriate to determine the range of public services in scope and what key criteria should inform this

    All services that are delivered on behalf of the public should be covered. If a public service uses the data for the delivery of its own task, then the data should be made available.

    5. What would be appropriate mechanisms to encourage or ensure publication of data by public service providers?

    We need to encourage a transition to a more intelligent and aware data policy. This cannot be done in one fell swoop but needs to inform procurement, so that when IT systems are upgraded, the ability to express data openly from a system would have to be implemented.

    1. How should we establish a stronger presumption in favour of publication than that which currently exists?

    Emphasis needs to shift to a position where exemption from publication is the exception and requires sufficiently rigorous justification.

    2. Is providing an independent body, such as the Information Commissioner, with enhanced powers and scope the most effective option for safeguarding a right to access and a right to data?

    Enhancing the powers of the Information Commissioner is crucial in this process. It is also vital that the ICO becomes a key motivator in creating an open-by-default policy. The ICO would then be able to put pressure on public bodies to standardise the way that they create data, ideally bringing about a more intelligent public data environment.

    3. Are existing safeguards to protect personal data and privacy measures adequate to regulate the Open Data agenda?

    Protection of personal data and privacy is vitally important, and there have to be real teeth for organisations, both public and private, that transgress these rules. There also has to be an understanding that networked technologies will circumvent many safeguards.

    4. What might the resource implications of an enhanced right to data be for those bodies within its scope?

    The enhanced right to data could, if implemented wrongly, be very resource heavy. The starting position should be that public bodies are the biggest users of their own data; the present systems in place for shared intelligence and services are fundamentally flawed and need to change. For example, one local authority uses 30 separate GIS systems, with each departmental head believing that theirs is the best. If you get it right for the public sector’s own use, the rest is easy.

    5. How will we ensure that Open Data standards are embedded in new ICT contracts

    Open data and open platforms need to be embedded into the procurement process. We need to break the straitjacket of public services being sold into proprietary IT contracts where the public body isn’t able to use its own data beyond the purposes originally specified. There also has to be a more intelligent procurement process, where a seemingly value-for-money initial cost is weighed against the costly process of upgrading.

    1. What is the best way to achieve compliance on high and common standards to allow usability and interoperability?

    There are a number of standards that are open, extensible and interoperable.

    2. Is there a role for government to establish consistent standards for collecting user experience across public service

    Government is the only authority that can establish compliance amongst public bodies.

    3. Should we consider a scheme for accreditation of information intermediaries, and if so how best that might work

    No. As long as there is equal access to data for all the market should be able to create the right mechanism.

    1. How would we ensure that public service providers in their day to day decision-making honour a commitment to Open Data, while respecting privacy and security considerations.

    There needs to be the establishment of a robust data release framework in which sensitive data would be identified at an early stage. There also needs to be an honest position with regard to this, where data collectors don’t combine data so that it can then be covered by the DPA.

    2. What could personal responsibility at Board-level do to ensure the right to data is being met include? Should the same person be responsible for ensuring that personal data is properly protected and that privacy issues are met?

    Corporate responsibility at board level.

    3. Would we need to have a sanctions framework to enforce a right to data?

    Yes, change can’t happen without sanction.

    4. What other sectors would benefit from having a dedicated Sector Transparency Board

    We think that the duplication of tasks is unnecessary when you have a common and clear set of standards.

    1. How should public service make use of data inventories? What is the optimal way to develop and operate this?

    Data inventories should serve both internal and external purposes.

    2. How should data be prioritised for inclusion in an inventory? How is value to be established

    Ideally there should be identification of a common set of ‘high value’ datasets that will help to embed the validity of open data; these will also help to create a first wave of interpretations and applications. An implementation of a common data release plan would then be undertaken.

    3. In what areas would you expect government to collect and publish data routinely?

    All areas

    4. What data is collected ‘unnecessarily’? How should these datasets be identified? Should collection be stopped?

    There is a great deal of duplication of data within the public sector and this needs to be minimised. Careful consideration should be given to what ‘unnecessarily’ actually means: if it simply means that the data isn’t currently being used, that alone shouldn’t justify stopping collection.

    5. Should the data that government releases always be of high quality? How do we define quality? To what extent should public service providers ‘polish’ the data they publish, if at all?

    You would expect that data that is collected on the public’s behalf for the delivery of public service should be of high quality and if it isn’t there is something that is wrong with the system. Although it might be necessary to anonymise or redact certain data this should only be undertaken in tightly defined cases.

    1. How should government approach the release of existing data for policy and research purposes: should this be held in a central portal or held on departmental portals?

    Ideally all the data should be held on the same portal so that there is no need to search for it

    2. What factors should inform prioritisation of datasets for publication, at national, local or sector level?

    High-value ‘quality of life’ datasets should always be identified – the quick wins.

    3. Which is more important: for government to prioritise publishing a broader set of data, or existing data at a more detailed level.

    Data should be released at the source resolution. Additional work to create different resolutions of the data should be discouraged.

    1. Is there a role for government to stimulate innovation in the use of Open Data? If so, what is the best way to achieve this?

    Definitely. For Open Data to become sustainable we need to encourage not only the supply side but the demand side as well. Where market failure occurs, or where there is nascent development of a sector, there is a need to stimulate activity to drive awareness, create services and applications and develop a base layer from which further development can be derived.