
The popular website KildareStreet.com was effectively shut down earlier this month by changes to data feeds on the Irish Government’s Oireachtas website. The independent website – created and run by John Handelaar – utilized public data sets to provide a convenient, searchable archive of everything said in the Dáil [Irish Parliament] since January 2004, and the Seanad [Ireland’s Upper House] since September 2002.

While the site was hugely popular – receiving over 570,000 unique visitors in the year to September (a third of these from Irish government addresses) – it has been unable to update its archive of data since the beginning of the new Dáil term on September 18th. The cause lies in changes to how the Houses of the Oireachtas publishes its XML feeds – changes which raise questions about the Government’s commitment to Open Data.

According to a statement by John Handelaar of Kildarestreet:

On September 18th, 2012, with no warning or published statement of intent, a significant change to the Houses of the Oireachtas website housing the public record of Dáil and Seanad debates was made, effectively killing KildareStreet.com for the foreseeable future.

It appears that the changes were made to achieve efficiencies through the removal of a layer of outsourcing. Ryan Meade’s blog on the debacle sums up the rationale:

Mark Mulqueen, Head of Communications for the Oireachtas, confirmed to me on Twitter that the recent changes to the site were designed to achieve efficiencies by ending the outsourcing of “a large amount of work involved in debates. That’s where a saving arises.” I asked him if this meant that Propylon were no longer managing the debate records and he replied, “Yes, I can confirm that to be the case. Using existing resources we will provide access to debates more quickly”.

The consequences of these changes (which were implemented without prior warning), however, were to kill the data feeds that KildareStreet relied on. In order to recover from this loss of data, KildareStreet has embarked on a two-week fundraising campaign to fund a new edition of the website that caters for the Government’s XML design changes (and its use of the proprietary and dated Lotus Notes platform).

This rebuilding effort is expected to take several weeks to design and implement.

It is worth noting, however, that the changes to the Oireachtas website have not gone without incident. Simon McGarr notes (and I can personally testify to the following):

The search doesn’t work and never did. You can’t link to any particular part of a debate. You can’t look for contributions by a particular Oireachtas member. Basically, you can’t do anything you could possibly imagine you might actually want to use a record of the Oireachtas debates for.

KildareStreet in contrast has a very intuitive user interface and allows for email alerts when various key words are spoken in parliament. It has operated since 2009 on €5,800 raised from donors and does not receive any government funding. It’s a supremely efficient and effective service and deserves public support.

To donate, go to kildarestreet.com/zombies

Clay Shirky’s recent TED talk, “How the Internet will (one day) transform government,” is a powerful assessment of how governments can utilize the power of the Internet to lower the costs of doing things. He investigates the government’s role in coordinating collective effort, and how the intrinsic features of the web can dramatically lower those costs and so facilitate big changes.

Via TED:

The open-source world has learned to deal with a flood of new, oftentimes divergent, ideas using hosting services like GitHub — so why can’t governments? In this rousing talk Clay Shirky shows how democracies can take a lesson from the Internet, to be not just transparent but also to draw on the knowledge of all their citizens.


The UK Cabinet Office’s behavioural insights team recently published a guidance document on the topic ‘Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials’. The paper describes how the government should be more rigorous in assessing the impact of its policies to make sure they’re effective, deliver value for money and represent the best use of government resources.

Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials

The central thesis of the guidance is that ‘Randomised controlled trials (RCTs) are the best way of determining whether a policy is working’ and that these should be used more to test the effectiveness of public policy interventions. The report says random trials could be applied to “almost all aspects of public policy”. It recommends starting the trials in uncontroversial areas (see Reducing fraud, error and debt) – such as the wording on tax reminders (one trial suggested different wording could improve effectiveness by around 30%) – before working up to more contentious social issues.

Nine Key Steps to Conducting a Randomised Controlled Trial

The paper then goes on to outline in detail the nine separate steps required to set up any randomised controlled trial. These steps represent the Behavioural Insights Team’s ‘test, learn, adapt’ methodology, and focus on better understanding what works and continually improving policy interventions to reflect learning from different trials.

Test – (ensuring you have put in place robust measures that enable you to evaluate the effectiveness or otherwise of the intervention)

1) Identify two or more policy interventions to compare (e.g. old vs new policy; different variations of a policy).
2) Determine the outcome that the policy is intended to influence and how it will be measured in the trial.
3) Decide on the randomisation unit: whether to randomise to intervention and control groups at the level of individuals, institutions (e.g. schools), or geographical areas (e.g. local authorities).
4) Determine how many units (people, institutions, or areas) are required for robust results.
5) Assign each unit to one of the policy interventions, using a robust randomisation method.
6) Introduce the policy interventions to the assigned groups.

Learn – (analysing the outcome of the intervention, so that you can identify ‘what works’ and whether or not the effect size is great enough to offer good value for money)

7) Measure the results and determine the impact of the policy interventions.

Adapt – (using learnings to modify the intervention, so that you are continually refining the way in which the policy is designed and implemented)

8) Adapt your policy intervention to reflect your findings.
9) Return to Step 1 to continually improve your understanding of what works.
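Steps 3–6 above are essentially a randomisation procedure, and can be sketched in a few lines of Python. The unit names, arm labels, and group sizes below are purely hypothetical, not from the paper:

```python
import random

def assign_to_arms(units, arms=("control", "new_policy"), seed=42):
    """Randomly assign each randomisation unit (person, school, or area) to a policy arm."""
    rng = random.Random(seed)  # a fixed seed keeps the assignment reproducible and auditable
    shuffled = list(units)
    rng.shuffle(shuffled)
    # deal the shuffled units round-robin into equally sized groups
    return {arm: shuffled[i::len(arms)] for i, arm in enumerate(arms)}

# Hypothetical example: six local authorities as the randomisation unit (step 3)
groups = assign_to_arms(["Area A", "Area B", "Area C", "Area D", "Area E", "Area F"])
```

Shuffling before dealing out the units is what makes the assignment robust: no unit’s characteristics can influence which arm it lands in.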


Along with providing detailed steps for how to create a randomised controlled trial, the document outlines a series of examples to illustrate the benefits that can flow from such systematic analysis of public policy interventions.

One such example concerns a controlled experiment to study the impact of three schemes to support incapacity benefit claimants in England: support at work, support focused on their individual health needs, or both. The experiment found that while the extra support cost £1,400 on average, it delivered no benefit over the standard support that was already available. Thus, the RCT ultimately saved the taxpayer millions of pounds, as it provided unambiguous evidence that the additional support was not having the intended effect.

These kinds of findings provide important feedback for policy-makers, who can then search elsewhere for effective solutions to practical problems.

Debunking the myths

Randomised trials are common practice in many areas of development and science, and are used nearly universally as a means of assessing which of two medical treatments works best. This was not always the case, however: when they were first introduced in medicine they were strongly resisted by some clinicians, who believed that personal expert judgement was sufficient to decide whether a particular treatment was effective.

The myths around RCTs primarily focus on four areas:

1) We don’t necessarily know ‘what works’ – the paper counters this by explaining how RCTs can still be worthwhile in quantifying the benefit and identifying which aspects of a programme have the greatest effect. We should be willing to recognise that confident predictions by policy experts often turn out to be incorrect, and RCTs demonstrate where interventions that were designed to be effective were in fact not.

2) RCTs don’t have to be expensive – costs very much depend on how the RCT is designed, and with planning they can often be cheaper than other forms of evaluation. This is particularly true when a service is already being delivered, and when outcome data is already being routinely collected. Rather than focusing on what an RCT costs to run, the paper suggests, it might be more appropriate to ask: what are the costs of not doing an RCT?

3) RCTs can be unethical – there are often objections to RCTs on the basis that it is unethical to withhold a new intervention from people who could benefit from it. This is particularly true of spending on programmes which might improve the health, wealth or educational attainment of one group. The paper notes that we cannot be certain of the effectiveness of an intervention until it is tested robustly, and that interventions long believed to be effective have sometimes turned out to be ineffective or even harmful on further experimentation.

4) RCTs do not have to be complicated or difficult to run – The paper notes that ‘RCTs in their simplest form are very straightforward to run.’ The paper explains in great detail the pitfalls and provides much advice on the steps needed – as briefly outlined above – to create such a trial.

The gathering of evidence to support public policy is increasingly important when government resources are stretched. There is a huge responsibility on public policy leaders to show the effectiveness of their projects and ensure value for money. Only those services delivering proportionate value for money should be funded, while those programmes failing to deliver evidence-based results should face reform.

(more at Cabinet Office and guardian.co.uk)



OMB’s ‘Doing What Works’ Memo


OMB M-12-14, Use of Evidence and Evaluation in the 2014 Budget

Late last week, the US Office of Management and Budget (OMB) released new guidance on the use of evidence and evaluation in making budget trade-off decisions. The guidance encourages agencies to use evidence-based decisions when developing their budgets for Fiscal Year 2014 (the essence of the Doing What Works project from the Center for American Progress).

The OMB memo notes (my emphasis):

Budget submissions also should include a separate section on agencies’ most innovative uses of evidence and evaluation, addressing some or all of the issues below. Many potential strategies have little immediate cost, and the Budget is more likely to fund requests that demonstrate a commitment to developing and using evidence. The Budget also will allocate limited resources for initiatives to expand the use of evidence, including but not limited to approaches outlined below.

OMB’s guidance provides details and examples of existing approaches agencies could pursue:

1) Proposing New Evaluations 

Areas of potential focus outlined by OMB include:

  • Low-cost evaluations using administrative data or new technology – the Coalition for Evidence-Based Policy’s recent brief highlights how agencies can often use administrative data (such as data on wages, employment, emergency room visits or school attendance) to conduct rigorous evaluations, including evaluations that rely on random assignment, at low cost.
  • Evaluations linked to waivers and performance partnerships – OMB notes that “One of the best ways to learn about a program is to test variations and subject them to evaluation, using some element of random assignment or a scientifically controlled design.” This is one of the primary measurement techniques utilized in the UK government’s recent Nudge report into behavioural insights.
  • Expansion of evaluation efforts within existing programs – agencies can add a general policy and requirements favoring evaluation into existing grants, contracts, or waivers (may require additional legislation)
  • Systemic measurement of costs and cost per outcome –  Agencies are encouraged to include measurement of costs and costs per outcome as part of the routine reporting of funded programs to allow for useful comparison of cost-effectiveness across programs.

 2) Using comparative cost-effectiveness data to allocate resources

Through the Pew Charitable Trust’s Results First initiative, a dozen States are currently adopting a model developed by the Washington State Institute for Public Policy (WSIPP) that ranks programs based on the evidence of their return on investment. The model calculated the return on investment to taxpayers from evidence-based prevention and intervention programs and policies. OMB wants such evidence-based programs to be identified so that a comparative analysis of these can “improve agency resource allocation and inform public understanding.”
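The WSIPP-style ranking described above amounts to ordering programmes by their return per unit of cost. A minimal sketch of the idea, using entirely made-up benefit and cost figures (not WSIPP’s actual estimates):

```python
# Hypothetical per-participant benefits and costs, in dollars, for three programmes
programs = {
    "Program A": {"benefit": 12000, "cost": 3000},  # benefit/cost ratio 4.0
    "Program B": {"benefit": 5000, "cost": 2500},   # benefit/cost ratio 2.0
    "Program C": {"benefit": 9000, "cost": 1500},   # benefit/cost ratio 6.0
}

def rank_by_return(programs):
    """Order programmes by benefit-to-cost ratio, highest return on investment first."""
    return sorted(programs,
                  key=lambda name: programs[name]["benefit"] / programs[name]["cost"],
                  reverse=True)

ranking = rank_by_return(programs)  # ["Program C", "Program A", "Program B"]
```

The real WSIPP model is far richer (it discounts future benefits and models uncertainty), but the comparative-ranking output is the part OMB wants agencies to use.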

 3) Infusing evidence into grant-making

OMB outlines how agencies should consider the following approaches to increase the use of evidence in formula and competitive programs:

  • Encouraging use of evidence in formula grants – OMB wants agencies to propose ways to increase the use of evidence-based practices within formula grant programs.
  • Evidence-based grants – several departments have implemented evidence-based grant programs that apply a tiered framework to assess the evidence supporting a proposed project and to determine appropriate funding levels.
  • Pay for Success – OMB notes how the Departments of Justice and Labor will be inviting grant applicants to use a “pay for success” approach, under which philanthropic or private entities (the “investors”) pay providers upfront and are only repaid by the government if certain outcomes are met.

4) Using evidence to inform enforcement

Rigorous evaluation of strategies for enforcing criminal, environmental and other laws often reveals that some approaches are significantly better than others at securing legal compliance. OMB is encouraging agencies to outline how their allocation of resources among enforcement strategies is informed by such evidence.

5) Strengthening agency evaluation capacity

OMB recommends agencies have a high-level official who is responsible for program evaluation, including developing and managing the agency’s research agenda, along with conducting and overseeing rigorous and objective studies.

OMB Support

As part of the memo, OMB also outlines the support it will provide to agencies to initiate and analyse the roll-out of their evidence-based initiatives. Along with organizing discussions with senior policy officials and research experts, they also plan to “reinvigorate the interagency evaluation working group” in this area.

A lot of the ideas outlined in the memo read like common sense: agencies should measure and evaluate what works through the use of experiments and control groups, and those results should then inform policy and decision-making. The memo lays out a very specific agenda for program evaluation in agencies and champions the use of structured evaluations to inform future decision-making. The Center for American Progress and others have provided detailed reports and analysis on how to conduct such experiments and quantify the results. This, along with the interesting work by the behavioural insights team at the UK Cabinet Office, should make government work more effectively by formulating policy based on evidence and rigorous evaluation of budget and management decisions. As the memo says:

Where evidence is strong, we should act on it. Where evidence is suggestive, we should consider it. Where evidence is weak, we should build the knowledge to support better decisions in the future.

(h/t The Business of Government)


A report by the UK Cabinet Office’s behavioural insights team published earlier this year, reveals that by applying behavioral insights, the government could reduce fraud, error and debt by billions. It claims that small changes to government processes, forms and language can have a big impact on behaviour and if applied nationally could save “hundreds of millions of pounds”.

The cabinet office team has close links with Professor Richard Thaler, co-author of Nudge: Improving Decisions About Health, Wealth, and Happiness, with the result that many of the ideas espoused in the book are outlined and implemented in the research paper below.

Behavioural Insights Team Paper on Fraud, Error and Debt

Seven simple steps

The report highlights seven simple steps, based on evidence from behavioural science, that show how “by going with the grain of how people behave, we can reduce the prevalence of fraud, error and debt”.

  1. Make it easy: Make it as straightforward as possible for people to pay tax or debts, for example by pre-populating a form with information already held
  2. Highlight key messages: Draw people’s attention to important information or actions required of them, for example by highlighting them upfront in a letter
  3. Use personal language: Personalise language so that people understand why a message or process is relevant to them
  4. Prompt honesty at key moments: Ensure that people are prompted to be honest at key moments when filling in a form or answering questions
  5. Tell people what others are doing: Highlight the positive behaviour of others, for instance that ‘9 out of 10 people pay their tax on time’
  6. Reward desired behaviour: Actively incentivise or reward behaviour that saves time or money
  7. Highlight the risk and impact of dishonesty: Emphasise the impact of fraud or late payment on public services, as well as the risk of audit and the consequences for those caught

Eight trials

In order to explore the effectiveness of the insights above, the team adopted a ‘test, learn, adapt’ approach. These case studies included randomised controlled trials (RCTs), which divide the study population into two or more groups by randomly assigning individuals to each. The team then gave the intervention (e.g. a modified letter, a changed process, or a new text message) to one of these groups while continuing to treat the other group as business as usual. Through this they determined the difference in effectiveness of each of the interventions.
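The comparison at the heart of each trial is simply the difference in outcome rates between the two groups. A minimal sketch, with response counts invented purely for illustration (not figures from the report):

```python
def response_rate_lift(control_paid, control_n, treatment_paid, treatment_n):
    """Difference in response rates between the intervention group and the
    business-as-usual control group."""
    return treatment_paid / treatment_n - control_paid / control_n

# Hypothetical trial: a modified letter sent to 1,000 people,
# the standard letter to another 1,000
lift = response_rate_lift(control_paid=335, control_n=1000,
                          treatment_paid=390, treatment_n=1000)
# lift ≈ 0.055, i.e. a 5.5 percentage point increase in payment rate
```

In a real trial this difference would also be tested for statistical significance before drawing conclusions, since small lifts can arise by chance.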

This document describes eight separate RCTs, each testing the effectiveness of one or more interventions which behavioural science suggests may be effective in reducing fraud, error or debt.

  • Using social norms: investigates whether informing people that the vast majority of those in their area have already paid their tax can significantly boost payment rates. Result: a 15% increase in payment rates for the localised social norm letters over the old-style control letter, which contained no social norm.
  • Highlighting key messages and norms: examines whether we can increase tax compliance among doctors by simplifying the principal messages and actions required, as well as using social levers and norms. Result: two new behavioural letters have already resulted in voluntary disclosures (from medics with outstanding tax liabilities) worth over £1 million, with the final total expected to be several million pounds.
  • Using salient images: investigates whether using images captured by DVLA can help to reduce unnecessary repeat correspondence and encourage prompt payment of fines. Result: expected Spring 2012, but early results suggest that a simplified letter and image (of an offending untaxed vehicle) is outperforming the original and simplified letters.
  • Better presentation of information: explores different ways of presenting information to discover which is most effective at encouraging the payment of debts. Result: Preliminary results show the collective and personalised appeals not to overlook the letter (stating that to do so would be treated as an active choice) are significantly more effective (10%) than other forms of letters.
  • Personalising text messages: tests the impact of sending more personalised text messages on people’s propensity to pay court- ordered fines. Result: expected Spring 2012, but preliminary results suggest that texts with the recipient’s name appear to increase, by about 10%, the number of people making a payment.
  • Prompting honesty: examines whether simplifying key messages, emphasising the consequences of fraud and getting people to sign forms upfront results in more honest declarations. Result: expected Spring 2012, but preliminary results indicate that a new letter (incorporating many of the seven insights above) resulted in a 6% reduction in responses to renew the Single Person Discount compared with the original letter. This is likely to represent a reduction in fraudulent applications.
  • Varying the tone of letters: explores the effectiveness of different types of communication in encouraging plumbers to get their tax affairs up to date. Result: expected Spring 2012.
  • Using beliefs about tax: tests the effectiveness of different messages – related to the fact that most people think that paying tax is the right thing to do – on payment of tax debts by companies. Result: preliminary findings suggest that what seems to be effective is to point out – in letters to companies – any gap between a taxpayer’s belief that ‘businesses should pay their taxes’ and the fact that their own company currently owes tax.


The results of many of the trials indicate the changes in behaviour that can result from highlighting key messages and actions required, presenting information more effectively, and exploring different types of communication:

The insights outlined in this document, applied in a range of different contexts and settings, show that not only is it possible to apply behavioural insights to reduce fraud, error and debt, but also that it can be done in a highly cost-effective way.

The report strikes a cautionary note, however, that not all techniques are feasible. For example, adding a post-it note or a handwritten name to an official communication seems to increase responses, but is not feasible, “given the scale of many government communications”. Thus the effectiveness of interventions will depend heavily on the context in which they are applied:

Policymakers should innovate, but should do so with humility about the limits of current knowledge, and with respect for what is acceptable and helpful to the public whom we serve.

This echoes some of the sentiment from the UK House of Lords Science and Technology Sub-Committee’s report last year, Behaviour Change, in which they found:

…“nudges” used in isolation will often not be effective in changing the behaviour of the population. Instead, a whole range of measures – including some regulatory measures – will be needed to change behaviour in a way that will make a real difference to society’s biggest problems.


  • House of Lords Science and Technology Sub-Committee’s report, Behaviour Change

MerrionStreet.ie – A Cost Overview


A few weeks ago, I lodged a Freedom of Information (FOI) request with the Department of the Taoiseach for information related to the setup and running of MerrionStreet.ie. I requested the development and running costs of the site since its launch in July 2010.

The site was built for the Government on the WordPress open-source software platform by Arekibo (an Irish digital media company). The reported cost of the project was €40,000 and the project took about five months to build (from initial RFP to go-live in mid-2010).

The reason for the request was to see if the use of Open Source software (such as WordPress) can really deliver an efficient, cost-effective website, or whether there were any hidden costs associated with its deployment. MerrionStreet.ie is a relatively straightforward website based on the LAMP (Linux, Apache, MySQL, PHP) stack, and does not require complex interfaces or heightened security. It should therefore serve as a good barometer for whether an efficient hosting platform is available within the Irish Government for such sites.

The costs provided under the FOI are outlined below and are broadly in line with previously released estimates:

Merrionstreet.ie ICT Costs as at 1 Dec 2011

1. Development (includes 5% o/s balance of €907.50 paid April 2011)
2. Implementation support costs
3. Maintenance and support costs post implementation (2010)
4. Independent security testing
5. Hardware (PCs and laptops)
6. Software (Corel VideoStudio Pro X3 licences)
7. Audio/video equipment (cameras, mikes, tripod, autocue cables etc.)
8. Hosting (2010)
9. Maintenance and support costs (2011) (excludes €907.50 o/s balance of development costs paid in April 2011)
10. Hosting (2011), estimated


The FOI response also detailed the following costs:

  • Total running costs for MerrionStreet since inception in July 2010 are €57,777.35 (as per the above: €45,751 in 2010 and an estimated €12,026 for 2011)
  • Total costs paid to Arekibo to date are €35,204
  • Total staff costs since July 2010 are €76,965. The team that maintains and updates the site content is drawn from staff in the Government Information Service in the Department of the Taoiseach, with the exception of two temporary staff – journalism graduates requiring relevant experience who were recruited at CO level (although one has recently left the Department).

Hosting spend and cost per request

Of all the costs above, one of the largest is the hosting cost, estimated at €4,500 for this year. This is despite the fact that many of the bandwidth-intensive objects on MerrionStreet.ie, e.g. videos, are stored on third-party platforms such as YouTube. The website is currently hosted by the Local Government Computer Services Board and costs are calculated on a cost-sharing basis at the end of each year. The costs for 2011 are expected to be in the range of €4,000 to €5,000.

The site has had 784,336 pageviews (see Google Analytics below – requested through FOI) since its launch in July 2010. With hosting charges of an estimated €6,500, this works out at €8.28 per 1000 page views. In comparison to other sites (i.e. based on an analysis of UK based Gov websites), this appears to be high given the site is a static php site hosted on open-source low cost stack software.

There have been 202,116 unique visitors to the site, with an average of 2.47 pageviews per visitor. This works out at a cost per 1,000 visitors of around €20.45 – again relatively high in comparison to other sites.
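The cost-per-thousand figures above are simple division; a quick sketch of the pageview calculation, using the €6,500 hosting estimate and the pageview count quoted earlier:

```python
def cost_per_thousand(total_cost_eur, count):
    """Cost in euro per 1,000 pageviews (or visitors)."""
    return total_cost_eur / (count / 1000)

# 784,336 pageviews against an estimated €6,500 in hosting charges
per_1000_pageviews = cost_per_thousand(6500, 784_336)  # roughly €8.3 per 1,000 pageviews
```

The same function gives the per-visitor figure when fed the visitor count and the hosting total underlying that estimate.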

Moving to Cloud computing

Comprehensive data on the cost and analytics of Irish Government websites is not freely available. Many government websites around the world freely release their web statistics (e.g. the NY Senate and the UK Dept. for Business, Innovation and Skills), but there are no examples of Irish central government departments publicly releasing this data. This data would make a good addition to the Government’s proposed centralised open data portal.

The Government has said it will continue to enhance the use of cloud computing in the public service and a “Cloud Computing Strategy” for the public service is expected to be published at the end of the first quarter 2012. The recent Public Service Reform plan contains a provision to “Seek, through market exercises, to develop a compelling case over traditional computing provision for infrastructure-as-a-service (IAAS) provision for the public service”.

Moving many sites (particularly relatively simple static sites such as those under the remit of the Dept. of the Taoiseach) to a shared cloud hosting environment would reduce hosting expenditure and provide improved performance for site users. A cloud infrastructure would reduce costs by allocating resources as the websites require them, and should be seen as a priority during this period of economic austerity. MerrionStreet.ie is an optimum site to migrate, and its corresponding hosting expenditure should be monitored to see if such a move can significantly reduce costs for 2012.

MerrionStreet.ie 20100712-20111219 Analytics Dashboard Report



Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported
This work by http://www.rfahey.org is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.