STIP Monitoring Tool - Frequently Asked Questions

What is the EC-OECD Science, Technology and Innovation Policy (STIP) database?
The EC-OECD Science, Technology and Innovation Policy (STIP) monitoring tool systematically collects quantitative and qualitative data on national science, technology and innovation (STI) policies in a harmonised way. It addresses all areas of STI policy, including initiatives spread across different ministries and national agencies with competence over domains as broad as research, innovation, education, industry, environment, labour and finance/budget, among others.
 
The monitoring tool is based on the 2017 STIP survey and was implemented to allow countries to update the information provided in the survey on a continuous basis. The database is unique in its scope, nature and scale and has become a major international tool for monitoring and analysing countries’ science and innovation policies.
How will the data collected through the STIP monitoring tool be made accessible and used?
The STIP database provides unique information that can feed into countries’ strategic, benchmarking and policy mapping initiatives. The data collected is openly available at STIP Compass, where it can be shared, analysed and visualised, and linked to other data. These database features are enabled by semantic technologies that allow countries to develop powerful queries to respond to their specific needs and questions. Countries can also use standard queries embedded in interactive “dashboards”, which provide visually attractive overviews and details on demand, and enable faceted search and in-situ analysis of data. Countries can easily share and download these charts for their own “home” use.
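To give a flavour of what such a query could look like, below is a minimal sketch in Python using the SPARQLWrapper library. The endpoint URL, the stip: vocabulary and the property names are purely illustrative assumptions and do not reflect the actual STIP Compass schema or endpoint.

    # Illustrative sketch only: the endpoint URL and the stip: vocabulary are
    # hypothetical placeholders, not the real STIP Compass schema.
    from SPARQLWrapper import SPARQLWrapper, JSON

    ENDPOINT = "https://example.org/stip/sparql"  # hypothetical SPARQL endpoint

    query = """
    PREFIX stip: <https://example.org/stip/vocab#>
    SELECT ?initiative ?name ?country
    WHERE {
      ?initiative a stip:PolicyInitiative ;
                  stip:name ?name ;
                  stip:country ?country ;
                  stip:theme stip:PublicResearchSystem .
    }
    LIMIT 20
    """

    sparql = SPARQLWrapper(ENDPOINT)
    sparql.setQuery(query)
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()

    for row in results["results"]["bindings"]:
        print(row["country"]["value"], "-", row["name"]["value"])
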
 
Reported policy initiatives are also an essential input to various OECD and EC projects and publications. They are used notably in the OECD biennial STI Outlook flagship publication, where country information is used to inform policy makers of recent and anticipated changes in global patterns of STI policies. To some extent, STIP data also allows for comparative analyses of STI policies and instruments and national benchmarking of STI policy performance. The database is also an important source of information for the European Commission’s detailed analysis of EU countries’ plans for budget, macroeconomic and structural reforms in the framework of the “European Semester”. 
What changed in the 2017 edition of the EC-OECD STIP database relative to previous exercises?
For more than 20 years, the OECD’s Directorate for Science, Technology and Innovation (DSTI), under the aegis of the OECD Committee for Scientific and Technological Policy (CSTP), has conducted biennial “Outlook” surveys of science and innovation policies in its member and strategic partner countries. Over this time, the content, format and underlying collection tools of the STIP survey have evolved significantly. The survey was conducted jointly with the European Commission’s Directorate-General for Research & Innovation (DG RTD) for the first time in 2015.
 
With both organisations pooling resources, the 2017 edition of the STIP survey underwent major revisions, with a view to improving the quality and accessibility of the data collected and to reducing the data collection burden on countries and analysts. Country feedback on the survey’s strengths and weaknesses has been important in this regard. The result is a significantly streamlined and sharpened survey, administered through a new user-friendly online questionnaire. The survey’s results were made available for the first time in a new semantic web database, permitting countries to easily retrieve and reuse the data for their own policy purposes. Following the survey, the online questionnaire was extended into a monitoring tool allowing the database to be progressively updated (before the next edition of the STIP survey).
What is the geographic coverage of the EC-OECD STIP database?

The coverage of the STIP database has increased over the years. The 2017 edition includes 60 participating countries and territories, a record number. This coverage includes several emerging economies (e.g. Brazil, the People’s Republic of China, India, Indonesia, the Russian Federation, and South Africa). Taken together, countries covered by the database account for an estimated 98% of global R&D. The full list of participants is provided below.

List of participants in the EC-OECD STI Policy Survey 2017

Argentina, Australia, Austria, Belgium (Federal), Belgium (Brussels), Belgium (Flanders), Belgium (Wallonia), Brazil, Bulgaria, Canada, Chile, China, Colombia, Costa Rica, Croatia, Cyprus, Czech Republic, Denmark, Egypt, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, India, Indonesia, Ireland, Israel, Italy, Japan, Kazakhstan, Korea, Latvia, Lithuania, Luxembourg, Malaysia, Malta, Mexico, Morocco, Netherlands, New Zealand, Norway, Peru, Poland, Portugal, Romania, Russian Federation, Slovak Republic, Slovenia, South Africa, Spain, Sweden, Switzerland, Thailand, Turkey, United Kingdom, United States, European Union.

Several countries are organised along federal lines, with significant STI policy initiatives designed and implemented at the sub-national level. Despite this, the data should reflect only initiatives at the national level, even if this means a portion of a country’s STI policy initiatives remain unreported. The single exception is Belgium, where the 2017 survey edition is experimenting with collecting data at national and sub-national levels. 

What is the thematic coverage of the EC-OECD STIP database?

The STIP database adopts a wide definition of STI policies, ranging from the traditional focus on public research and business innovation policy to all public policy initiatives that aim to support, for example, social innovation, environmental innovation or innovation in developing countries. The database covers this broad scope under six themes, each of which includes several questions (see figure below). Horizontal issues such as the participation of minorities in research and innovation, or multistakeholder engagement in STI policy making, are addressed through specific questions within these themes.

In addition to these six “core” themes, the 2017 edition of the STIP survey included two “modules”: one on STI policies related to “digitalisation” and another on European Research Area-related initiatives (for EU and associate members only).

What are the digitalisation and ERA “modules” and why have they been included in the database?

The 2017 STIP survey included a “core” survey and additional “modules”. The core survey is made up of six themes (see question on thematic coverage), whose questions will remain largely unchanged from one survey exercise to the next. This stability provides a basis for comparison over time of countries’ STI policies. Modules are, by contrast, one-off or less regular themes that include questions related to ongoing OECD or EC projects and interests. They provide flexibility, allowing the survey to ask questions on newly emerging and hot topics. The 2017 edition of the survey included two such modules, one dedicated to ERA-related initiatives, to be completed only by EU member and associate countries, and another on STI policies related to “digitalisation”. The latter reflects the prominence of this theme in the 2017-18 programme of work and budget of the OECD Committee for Scientific and Technological Policy (CSTP) and its working parties.

How do I access my country’s database?

A unique URL for each country was included in the invitation email sent to the database's national contact points. The URL is deliberately difficult to remember and acts as password protection, limiting access to the site. It should therefore not be shared, just as a private password would not be shared. National contact points can create additional accounts, associated with similarly unique private URLs, that grant other respondents access and allow them to edit information in the corresponding country's database.

Why are there already data in my country’s monitoring tool?
To reduce the burden on respondents, countries’ databases have been pre-filled as far as possible with curated responses provided in previous survey rounds.  
 
Since the 2017 edition of the survey has been significantly modified, the OECD Secretariat has developed a correspondence matrix with the previous edition to automatically transfer the data into the new survey structure. The OECD Secretariat recognises that this automatic prefilling might lead to some misallocation of initiatives to questions. To remedy this, respondents should review the list of policy initiatives already assigned to each question. Referring to the question prompts, respondents should consider whether any policy initiatives have been inappropriately assigned to the question. If this is the case, they should click on the “Tools” button to the left of the policy initiative’s title and select “unlink”. This will unlink the policy initiative from the question and it will no longer be listed against the question. The policy initiative is, however, still in the system and can be linked to other questions. In other words, it is not lost, even after it has been unlinked.
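Purely as an illustration of the idea (the actual correspondence matrix and question codes are internal to the Secretariat; everything below is hypothetical), the remapping of prefilled initiatives can be pictured as follows:

    # Hypothetical sketch of how a correspondence matrix could remap initiatives
    # reported under 2015 question codes to the 2017 survey structure.
    # The question codes and initiative names below are invented for illustration only.

    correspondence_matrix = {
        "Q2015_3.1": ["Q2017_TH2_05"],                  # one-to-one mapping
        "Q2015_4.2": ["Q2017_TH3_01", "Q2017_TH3_02"],  # one question split across two
    }

    answers_2015 = {
        "Q2015_3.1": ["National Research Fund"],
        "Q2015_4.2": ["SME Innovation Voucher Scheme"],
    }

    # Prefill the 2017 structure: each initiative is linked to every 2017 question
    # that its original 2015 question maps to (which is why some links may need
    # to be reviewed and "unlinked" by respondents).
    prefilled_2017 = {}
    for old_q, initiatives in answers_2015.items():
        for new_q in correspondence_matrix.get(old_q, []):
            prefilled_2017.setdefault(new_q, []).extend(initiatives)

    print(prefilled_2017)
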
 
Some of the policy initiatives assigned to a question may have been discontinued since the last survey in 2015. If this is the case, the respondent should click on the “Tools” button to the left of the policy initiative’s title and select “ceased”. The monitoring tool will automatically assign the current year as the end year. If the initiative ceased before the current year (for example, in 2016), the respondent should edit the policy initiative and set the end date accordingly.
 
Finally, note that policy initiatives are colour coded in the main interface to indicate whether they are prefilled from the previous survey round (blue), new (green) or ceased (grey).
How are new policy initiatives added to a question in the monitoring tool?
To assign new policy initiatives to a question, respondents should click on “Add new initiative” in the answer field. This field includes a predictive search feature. If, when typing the name of the policy initiative, the predictive search recognises it, this means it has already been entered into the system as an answer to another question, either by another respondent in your country or as part of the prefilling with data from the 2015 survey (see FAQ “Why are there already data in my country’s monitoring tool?”). The respondent can then decide to link it to the question, simply by clicking on the green “Link” button. Alternatively, the respondent can first look at the initiative to check whether its details remain accurate. In this case, the respondent should click the “Edit” button to check and update the fields as needed.
 
If, on the other hand, the name is not recognised, a new policy initiative will be created with a new form to fill out. Respondents will need to complete the fields in the form, as described in the FAQ “What information should be provided for each policy initiative?”.
 
Finally, if it is known that a policy initiative already entered into the database as an answer to one question should also be included as an answer to another, the monitoring tool’s “link” functionality can be used. This avoids having to enter the same information into the tool multiple times. As already described above, respondents should click on “Add new initiative” in the question’s answer field and start to type the policy initiative’s name. When the predictive search feature recognises it, the name of the policy initiative will appear and can be selected. It can then be linked to the question by clicking on the green “Link” button. This can also be reversed in case of mistakes by clicking on the “Tools” button to the left of the policy initiative’s title and selecting “unlink”.
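Conceptually, linking and unlinking treats questions and policy initiatives as a many-to-many relationship: one initiative record can answer several questions without being duplicated. The sketch below is a hypothetical illustration of this idea (not the tool's actual implementation), with a simple prefix match standing in for the predictive search:

    # Hypothetical illustration of the many-to-many link between questions and
    # initiatives; not the monitoring tool's actual implementation.

    initiatives = {1: "National Research Fund", 2: "SME Innovation Voucher Scheme"}
    links = {"TH2_Q05": {1}, "TH6_Q02": {2}}  # question code -> set of initiative ids

    def search(prefix):
        """Stand-in for the predictive search: initiatives whose name starts with the typed text."""
        return [i for i, name in initiatives.items() if name.lower().startswith(prefix.lower())]

    def link(question, initiative_id):
        links.setdefault(question, set()).add(initiative_id)

    def unlink(question, initiative_id):
        # The initiative is only detached from this question; the record itself remains.
        links.get(question, set()).discard(initiative_id)

    # A respondent starts typing "National..." under another question and links the
    # existing record instead of creating a duplicate.
    matches = search("National")
    if matches:
        link("TH1_Q03", matches[0])
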
The database asks countries to report their policy initiatives, but what exactly is a “policy initiative”?
The database’s questions ask respondents to provide information on relevant STI “policy initiatives”. Drawing partly on an EU definition, an STI policy initiative can be defined as a public action that i) aims to achieve one or several public policy goals in the policy area of science, technology and innovation, ii) is expected to modify or frame the behaviours of actors and stakeholders, whether domestic or foreign, that are part of, or influential on, the national innovation system, and iii) is implemented with a minimum time horizon or on a continuous basis (i.e. not as a one-off “event”).
 
The definition used in this database is therefore deliberately wide: a policy initiative can be a financial policy measure (e.g. a grant, a tax incentive, etc.), a programme (e.g. an environmental technology programme led by an environment agency, a cross-border research programme, etc.), a law or regulation (e.g. an evaluation/impact assessment requirement applying to the STI area), an informal framework (e.g. an indicative rule or guideline on stakeholder consultation on research priorities or on minorities’ inclusion) or an ‘institutional event’ (e.g. the creation during the last two years of a research agency, a high-level STI council, etc.).
 
A policy initiative may target different groups and may utilise one or more different policy instruments in its implementation  (see FAQ “What is a policy instrument?”).
 
At all stages, respondents can refer to the monitoring tool’s prompts to access useful information on what are considered valid policy initiatives for each question. Prompts exist for each question as well as for many of the fields used in the monitoring tool.
What should be the scope of a policy initiative?
A crucial issue in reporting in the monitoring tool relates to the scope of what a country considers to be a “policy initiative”. There is no unique solution or golden rule, and it is up to national authorities to decide how and what to report as policy initiatives in their respective STI systems. In doing so, respondents should consider the following trade-off: 
 
  1. Policy initiatives should not be too broad in scope, otherwise it will be difficult to provide sufficiently precise information on them (e.g. on their budgets, start dates, objectives, etc.) in the monitoring tool. For instance, a large STI framework law/plan encompassing several policy measures/instruments of a different nature would certainly be too large (the law or plan, however, should be reported under the theme on ‘governance’, for the question about national STI plans/strategies; and each of the plan’s/strategy’s policy initiatives reported separately under the relevant themes and questions).

  2. At the other extreme, dedicating a policy initiative to each thematic research programme of a national funding agency would be too specific, since their objectives and modes of implementation would be largely similar, and only the theme in which they operate would be different. In this case, a good approach might be to “group” these thematic research programmes into a single initiative (named, for instance, “thematic research programmes of the national funding agency”). Research programmes with clearly distinct objectives and/or modes of implementation (e.g. a programme with a stronger user-led approach, or an exploratory research programme), if any, could still be reported separately.

To illustrate with a hypothetical example, a funding agency may have a portfolio of 20 different grants. Of these, 12 may seek to promote business R&D and share similar beneficiaries, meaning they could be grouped together under the same initiative. Another subset of 7 grants could share the common objective of fostering basic research in the public sector and be grouped under a different policy initiative. Finally, it may be the case that one of the grants has a comparatively higher budget and aims particularly to encourage collaborative research between business and academia. In this case, a separate policy initiative could capture this unique type of grant.

What information should be provided for each policy initiative?
Respondents should provide information on each policy initiative through a standard form, which is accessed by clicking on the name of the policy initiative. The form has several open text and scroll-down fields, as follows (mandatory fields are highlighted with a red star; an illustrative example record is sketched after the list):
  • Name in English *
  • Name(s) in original language
  • Acronym (if any)
  • Start date (year)
  • End date (year)
  • Short description (1 sentence) *
  • Objective(s) *
  • Background and shifts in policy
  • Type(s) of policy instrument(s) * (scroll-down list, multiple choice possible) (see FAQ “What is a policy instrument?”)
  • Direct beneficiaries * (scroll-down list, multiple choice possible)
  • Name of responsible organisation(s) * (features predictive search to avoid having to enter the same information more than once)
  • Estimated budget * (provide either EUR per year, selected through a scroll-down list of budget ranges, single choice only, OR Budget amount per year in national currency)
  • Internet link(s)
  • Evaluated (yes/no) *
  • Link(s) to evaluation(s)
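To make the structure concrete, here is a sketch of what a completed record could contain. The initiative, organisation and values below are invented purely for illustration and are not taken from any country's database:

    # Invented example of a completed policy initiative record, for illustration only.
    example_initiative = {
        "name_english": "Collaborative Industry-Academia Research Grants",
        "name_original_language": None,          # optional
        "acronym": "CIARG",
        "start_year": 2016,
        "end_year": None,                        # ongoing
        "short_description": "Competitive grants co-funding joint research projects "
                             "between firms and universities.",
        "objectives": "Increase business-academia collaboration and knowledge transfer.",
        "background_and_shifts": "",
        "policy_instruments": ["Grants for business R&D and innovation"],
        "direct_beneficiaries": ["Firms of any size", "Higher education institutes"],
        "responsible_organisations": ["Ministry of Science (hypothetical)"],
        "budget_range_eur": "20M-50M",           # or an amount in national currency
        "links": ["https://example.org/ciarg"],
        "evaluated": True,
        "evaluation_links": [],
    }
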
 
There are a few things respondents should look out for in some of these fields:
 
First, the “Direct beneficiaries” field asks respondents to identify the groups that are the direct target of the policy initiative, e.g. those that receive financial support or are directly subject to a law or regulation. It is not unusual for a policy instrument to have more than one direct beneficiary, so the field allows respondents to make multiple selections. The list of direct beneficiaries used in the monitoring tool is shown below.
 

Direct beneficiaries, by category:

Researchers, students and teachers
  • Established researchers
  • Post-doctoral researchers
  • Undergraduate and master students
  • Secondary education students
  • PhD students
  • Teachers

Social groups especially emphasised
  • Women
  • Disadvantaged and excluded groups
  • Civil society

Capital and labour
  • Private investors
  • Entrepreneurs
  • Labour force in general
  • Workers with tertiary education and above specifically

Research and education institutions
  • Higher education institutes
  • Public research institutes
  • Private research and development lab

Governmental entities
  • National government
  • Subnational government

Firms by age
  • Firms of any age
  • Nascent firms (0 to less than 1 year old)
  • Young firms (1 to 5 years old)
  • Established firms (more than 5 years old)

Firms by size
  • Firms of any size
  • Micro-enterprises
  • SMEs
  • Large firms
  • Multinational enterprises

Intermediaries
  • Incubators, science parks or technoparks
  • Technology transfer offices
  • Industry associations
  • Academic societies / academies

 

Second, the “Name of responsible organisation(s)” field, which asks about the ministries or agencies responsible for funding and managing the policy initiative, includes a predictive search function. Accordingly, if a particular ministry or agency has already been entered elsewhere in the monitoring tool, its name will appear automatically as you type, saving you the trouble of retyping the information. This feature also reduces the likelihood that the same organisation will be reported in the monitoring tool under multiple different names, which would hamper later analysis of the data. You can enter several responsible organisations for any given policy initiative.

Third, in the “Estimated budget” field, you can either use the scroll-down “budget range in Euros” field to select a budget range (which is useful when you do not know the precise budget) or else use the text field below to type the amount in your country’s currency. The Euro budget ranges used in the monitoring tool are listed below.

  • Less than 1M
  • 1M-5M
  • 5M-20M
  • 20M-50M
  • 50M-100M
  • 100M-500M
  • More than 500M

Besides these ranges, you may select the option “Don’t know” when neither a range nor a precise amount is known. You may also select the option “Not applicable” if the policy initiative does not have a dedicated budget (for instance, the reform of a public body such as an innovation agency).
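Where a precise annual budget in euros is known, choosing the corresponding range is a simple threshold lookup. The short sketch below is purely illustrative of that mapping and is not part of the monitoring tool:

    # Illustrative helper mapping an annual budget in EUR to the monitoring tool's ranges.
    RANGES = [
        (1_000_000, "Less than 1M"),
        (5_000_000, "1M-5M"),
        (20_000_000, "5M-20M"),
        (50_000_000, "20M-50M"),
        (100_000_000, "50M-100M"),
        (500_000_000, "100M-500M"),
    ]

    def budget_range_eur(amount_eur):
        for upper_bound, label in RANGES:
            if amount_eur < upper_bound:
                return label
        return "More than 500M"

    # e.g. a 30 million EUR annual budget falls in the "20M-50M" range
    print(budget_range_eur(30_000_000))
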

Finally, note that the policy instrument field has some advanced features (see FAQ “What is a policy instrument?”). 

When all the fields have been completed in the policy initiative form, including for policy instruments, respondents can hit the “Save as complete” button. The policy initiative will then be assigned a green tick mark on the right side indicating its status as completed. If some mandatory fields have yet to be completed, the system provides a warning. If a respondent is unable to complete all of the fields and would like to return to the form later, they can also save the information they have entered using the “Save as draft” button. The policy initiative is then assigned a red empty tick box on the right side indicating its status as draft.

Even if an initiative has been saved “as complete”, you can still access it to make edits or even change it back to draft mode. What is important to keep in mind is that the aim is for all initiatives to be “saved as complete”.

What is a policy instrument?

Among the fields used to characterise a policy initiative is one dedicated to the policy instruments the initiative uses in its implementation. The monitoring tool incorporates a list of policy instruments for respondents to choose from, as shown below.

On clicking “Add a new instrument” in the policy initiative form, a mini form opens up in a new window. Here, respondents are prompted to select one of the policy instruments from a scroll-down list. On selecting a policy instrument, further fields appear in the mini form, specific to the policy instrument selected, inviting the respondent to provide more details. These fields are, for the most part, multiple choice mandatory questions. Once all mandatory fields are completed, the mini form should be saved. The type of policy instrument selected is then added to the policy initiative. 

Since many policy initiatives use a mix of policy instruments, respondents can add as many policy instruments as needed, repeating the steps just outlined.
 

Policy instruments, by instrument category:

Direct financial support
  • Institutional funding for public research
  • Project grants for public research
  • Grants for business R&D and innovation
  • Centres of excellence grants
  • Procurement programmes for R&D and innovation
  • Fellowships and postgraduate loans and scholarships
  • Loans and credits for innovation in firms
  • Equity financing
  • Innovation vouchers

Indirect financial support
  • Corporate tax relief for R&D and innovation
  • Tax relief for individuals supporting R&D and innovation
  • Debt guarantees and risk sharing schemes

Guidance, regulation and other incentives
  • Technology transfer and business advisory services
  • Labour mobility regulation and incentives
  • Intellectual property regulation and incentives
  • Science and innovation challenges, prizes and awards

Collaborative platforms and infrastructure
  • Clusters and other networking and collaborative platforms
  • Dedicated support to new research infrastructures
  • Information services and databases

Governance
  • National strategies, agendas and plans
  • Creation or reform of governance structure or public body
  • Policy intelligence (e.g. evaluations, reviews and forecasts)
  • Formal consultation of stakeholders and experts
  • Horizontal STI coordination bodies
  • Standards and certification for technology development and adoption
  • Public awareness campaigns and other outreach activities

Who should be contributing to the monitoring tool in my country?

The feedback from STIP database national contact points (in most cases CSTP and/or ERAC delegates) has revealed the diversity of practices in how countries contribute data. Many countries reply in a decentralised way, with several parts of the government taking responsibility for completing different questions in the monitoring tool. Other countries take a more centralised approach, with the national contact point taking the lead in collecting information and inputting this into the monitoring tool.

It is the responsibility of the national contact point in each country to decide on and coordinate the contributions of different respondents. National contact points should bear in mind that:

  • Experience points to a strong positive correlation between the number and institutional diversity of people involved on the one hand, and the quality and density of the data on the other.
  • The online monitoring tool specifically developed to update the data submitted in the 2017 edition of the survey makes it easy to decentralise data collection. Several respondents can work on the database at the same time (see FAQ “How do I access my country’s database?”). Moreover, the tool’s thematic structure supports an easy division of work, for instance, between different policy bodies.
Can several respondents work on the monitoring tool at the same time?

The online tool specifically developed to update the data submitted in the 2017 edition of the survey makes it easy to decentralise data collection. Several respondents can work on the database at the same time, the only limitation being that they cannot work on the same initiative in parallel (an unlikely occurrence, but as a safeguard, the monitoring tool is set to provide a warning message if such a situation arises).

How can data on the monitoring tool be printed?

It is possible to print data in the monitoring tool showing the policy initiatives mapped against the questions, be they active or ceased. This can be done by simply clicking on the printer button at the top of the page. This is a useful feature that provides an overview of the database’s completeness – policy initiatives that are completed have a green tick-box assigned to them, while those still in draft form have a red empty tick-box. The monitoring tool also supports the full printing of the data entered in policy initiatives.

How can I get additional support if I need more information?
You can access all information related to the STIP database online at http://bit.ly/OECDSTIP, including an overview of its structure, a video tutorial for respondents and a help desk.
 
Should you have any questions, please do not hesitate to contact the OECD Secretariat at STIPolicy.data@oecd.org.