Methodological approach Clause Samples

Methodological approach. Econometric estimation of total factor productivity (TFPQ) at the firm level, followed by econometric estimation of the causal effect of import competition on firm productivity.
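For reference, TFPQ in this kind of exercise is commonly recovered as the residual of an estimated production function; the Cobb–Douglas sketch below is purely illustrative and is not necessarily the estimator adopted in this work:

```latex
% Illustrative only: log TFPQ as the residual of a Cobb–Douglas production
% function, where q, l, k, m are logs of physical output, labour, capital and
% materials for firm i in year t, and the \hat{\beta}'s are estimated elasticities.
\ln \widehat{\mathit{TFPQ}}_{it} = q_{it} - \hat{\beta}_{l}\, l_{it} - \hat{\beta}_{k}\, k_{it} - \hat{\beta}_{m}\, m_{it}
```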
Methodological approach. The neuGRID project is focused on setting up “a grid-based e-infrastructure for data archiving/communication and computationally intensive applications in the medical sciences”. The exploitation of the developed infrastructure for the exchange of imaging and clinical data has been assured by a focused dissemination strategy, ensuring effective collaboration both within the project and with communities external to it, and coordinating neuGRID with related projects and activities carried out in Europe and elsewhere. The main objectives of the dissemination strategy are:
• To disseminate project results to the relevant scientific communities;
• To raise awareness at the political and decision-making levels of the opportunities offered by neuGRID;
• To spread knowledge of the facilities and tools supplied by the infrastructure within the research, academic and clinical communities;
• To assess the regulatory needs of the pharmaceutical industry for pre-competitive research and clinical trials, including clinical trial registration, agreements to be prepared and signed by potential industry users, IPR management, and regulations for data ownership, exchange and analysis; to define the adaptations or expansions of the present infrastructure needed to host industry pre-competitive research and randomized clinical trials with clinical and imaging/biological surrogates; and to define the set of activities required to make neuGRID compliant with industry needs;
• To promote compatibility of neuGRID with related initiatives being carried out in North America, Japan and Australia;
• To promote the integration into neuGRID of the most popular tools for brain image analysis, so that international researchers can carry out high-performance grid computing on their own or on merged datasets;
• To publicise the infrastructure’s aims and services so that they can be exploited in daily research and clinical practice;
• To teach potential users how to use the implemented services through the provided GUI;
• To teach research users how to take advantage of the high-performance computing facilities.
There are several possible channels for disseminating information and results about neuGRID, and the choice of modality varies with the communication target. As detailed in the Dissemination and training plan, throughout the 36 months of the project the dissemination activities have included conferences, teleconferences, meetings, workshops, letters of intent, emails, art...
Methodological approach. The structure and crystallinity of the zeolites were determined by X-ray powder diffraction using a Bruker AXS D8 Advance diffractometer equipped with a graphite monochromator and a Våntec-1 position-sensitive detector, using CuKα radiation in ▇▇▇▇▇–▇▇▇▇▇▇▇▇ geometry. Nitrogen adsorption/desorption isotherms were measured on a Micromeritics GEMINI II 2370 volumetric Surface Area Analyzer at -196 °C to determine surface area, pore volume and pore size distribution. Before the sorption measurements, all samples were degassed in a Micromeritics FlowPrep 060 instrument under helium at 300 °C (heating rate 10 °C/min) for 4 h. The specific surface area was evaluated by the BET method using adsorption data in the relative pressure range p/p0 = 0.05 to p/p0 = 0.25. The t-plot method was applied to determine the micropore volume (Vmic), and the amount adsorbed at relative pressure p/p0 = 0.98 was taken as the total adsorption capacity (Vtot). The concentration and type of acid sites were determined by adsorption of acetonitrile as a probe molecule followed by FTIR spectroscopy (Nicolet 6700 FTIR with DTGS detector) using the self-supported wafer technique. Prior to adsorption of the probe molecule, self-supported wafers of the zeolite samples were activated in situ by overnight evacuation at 450 °C. CD3CN adsorption proceeded at room temperature for 30 min at an equilibrium pressure of 5 Torr, followed by 30 min of degassing at room temperature. For quantitative analysis, the molar absorption coefficients for CD3CN adsorbed on Brønsted acid sites (ν(C≡N)-B at 2297 cm-1, ε(B) = 2.05 ± 0.1 cm μmol-1) and on strong and weak ▇▇▇▇▇ acid sites (ν(C≡N)-L1 at 2325 cm-1 and ν(C≡N)-L2 at 2310 cm-1, ε(L) = 3.6 ± 0.2 cm μmol-1) were used. Integral intensities of the individual bands were used, and the spectra were normalized to a wafer thickness of 10 mg cm-2. An Iso-Therm thermostat (e-Lab Services, Czech Republic), maintaining the sample temperature with an accuracy of ±0.01 K, was used for the measurement of carbon dioxide adsorption at temperatures from 273 K to 333 K. After the argon adsorption measurement, adsorption isotherms of CO2 were recorded on the same sample at 273 K, 293 K, 313 K and 333 K. The exact temperature was determined using a platinum resistance thermometer. Before each measurement, the zeolites were degassed overnight at 473 K (temperature ramp of 1 K min-1) under turbomolecular pump vacuum.
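For illustration, the acid-site concentrations follow from the integrated band intensities through a Beer–Lambert-type relation using the absorption coefficients and wafer normalization quoted above; the sketch below uses hypothetical intensity values and a hypothetical helper function, and is not the evaluation script used in the study:

```python
# Illustrative Beer–Lambert-type evaluation of acid-site concentrations from
# integrated FTIR band intensities: c = A / (eps * rho), where
#   A   ... integrated band intensity normalized to the wafer (cm^-1)
#   eps ... molar absorption coefficient (cm umol^-1)
#   rho ... areal wafer "thickness" used for normalization (mg cm^-2)
# The intensity values below are hypothetical placeholders, not measured data.

EPS_B = 2.05   # cm umol^-1, nu(C≡N)-B band at 2297 cm^-1 (Brønsted sites)
EPS_L = 3.6    # cm umol^-1, nu(C≡N)-L bands at 2325 and 2310 cm^-1 (L-type sites)
RHO = 10.0     # mg cm^-2, wafer thickness to which spectra are normalized

def acid_site_concentration(band_area: float, eps: float, rho: float = RHO) -> float:
    """Return the acid-site concentration in mmol g^-1 (= umol mg^-1)."""
    return band_area / (eps * rho)

# Hypothetical integrated band intensities (cm^-1) after normalization:
area_b, area_l = 4.1, 2.7
print(f"Brønsted sites: {acid_site_concentration(area_b, EPS_B):.3f} mmol/g")
print(f"L-type sites:   {acid_site_concentration(area_l, EPS_L):.3f} mmol/g")
```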
Methodological approach. We first checked the pressure and temperature conditions of operations and sampling at the CarbFix site to ensure that they fell within the technical specification range of the PUSH50. We then targeted a well-characterized piezotolerant bacterial strain (i.e. one that tolerates moderately elevated pressures) with a metabolism typical of those expected, for instance, at the CarbFix site, drawing on the earlier work done by the group at IPGP (Paris).
Methodological approach. 2.1. Robust methodology and general purpose software for the study of porosity
Methodological approach. As indicated in the previous sections, the present deliverable refers to three field sites. The methodological approach followed for each site has been practically the same, although with some differences induced by their specific characteristics. The first step has been the collection of geological and hydrogeological data on a regional scale to establish the site stratigraphy and to identify the SA and its groundwater level. Data have been collected from scientific papers, available maps and national databases. Subsequently, the analysis has focused on the local scale. More detailed information has been acquired from the people and companies responsible for the G-ER exploitation and/or the environmental quality monitoring of each field site, as enabled by the S4CE consortium. Whenever possible, measurements have been performed directly on site. Based on the collected data, the Conceptual Circulation Model (CCM) and the Numerical Circulation Model (NCM) of the Groundwater (GW) have been developed for each of the three sites. The level of detail of the three models differs because of differences in the availability and reliability of the datasets. Several simplifications have been made, especially for the Cornwall site, because of the lack of reliable data. More detailed information about the construction and characteristics of the models is reported hereinafter.
Methodological approach. This study uses an exploratory, sequential mixed-methods design (▇▇▇▇▇▇▇▇ & ▇▇▇▇▇ ▇▇▇▇▇, 2018; ▇▇▇▇▇▇▇ & ▇▇▇▇▇▇▇▇▇▇, 2009). Mixed-methods research brings qualitative and quantitative approaches together to leverage the advantages of both and, as a result, provides a more complete understanding of an issue than either method yields alone (▇▇▇▇▇▇▇▇ & ▇▇▇▇▇ ▇▇▇▇▇, 2018; ▇▇▇▇▇▇▇ et al., 2007; ▇▇▇▇▇▇▇ & ▇▇▇▇▇▇▇▇▇▇, 2009). We integrate qualitative and quantitative approaches throughout the study’s design, data collection, and analysis phases. In the study’s first phase, we conducted interviews with CSA directors and staff from a subset of counties to build a foundational understanding of how CSAs work with NCPs who have employment issues, and of how and where they connect these NCPs with support. We used findings from these interviews to inform the study’s second phase: a survey of Wisconsin CSA directors. Information from the interviews was used to refine the survey’s topics of interest, questions, and response categories. The quantitative findings from the survey are the primary focus of this report, and we augment the survey results with additional qualitative findings that provide deeper insight into the survey responses. Findings are organized by topic, with qualitative and quantitative findings woven together and discussed throughout each topic’s narrative (▇▇▇▇▇▇▇ et al., 2013). All study activities were approved and overseen by the University of Wisconsin−Madison’s Institutional Review Board.
Methodological approach. The methodological approach used in the study is summarised in the diagram shown below:
Methodological approach. For measuring robot exposure at the regional level, we follow the approach of ▇▇▇▇▇▇▇▇ and ▇▇▇▇▇▇▇▇ (2020). For given national changes in robot adoption at the industry level, this approach assigns stronger automation exposure to regions that were historically specialized in industries for which robot adoption has later been more substantial. We combine data on the adoption of industrial robots at the country-industry level, sourced from the International Federation of Robotics, with regional employment data, sourced either from Eurostat or from national sources. For measuring individual exposure to automation, we develop a novel methodology in the paper. In particular, in order to capture individual exposure to automation in a way that is not contaminated by the consequences of automation itself, we do not use information on the current occupation. Instead, we employ a vector of predicted probabilities for each individual to be employed in each occupation. Crucially, these probabilities are estimated based on individual characteristics and on the pre-sample, historical composition of employment at the occupation level in the region of residence. The individual vulnerability to automation is then obtained as the scalar product of this vector of probabilities and a vector of automatability scores of the occupations. In other words, the vulnerability score is a weighted average of the occupations’ automatability scores, where the weights are the individual’s predicted probabilities of employment in each occupation. To obtain the individual exposure to automation at the time of a given election, the vulnerability score is further interacted with the pace of robot adoption in the specific country and election year. Intuitively, for a given national pace of robot adoption, our measure of individual exposure assigns higher scores to individuals who would have been more likely – in the pre-sample historical labour market – to work in occupations whose automatability is higher. The logic of the individual measure is analogous to the one underlying the regional measure in Acemoglu and ▇▇▇▇▇▇▇▇ (2020). In that case, the vulnerability of a region is determined by its historical sectoral composition. In this case, the vulnerability of an individual is determined by the historical distribution of occupations in her labour market, in conjunction with her observable characteristics. We employ historical labour market data from the Eu...
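In compact notation, the construction described above can be summarised as follows; the symbols are illustrative shorthand rather than the notation of the underlying paper:

```latex
% p_{io}: predicted probability that individual i is employed in occupation o,
%         estimated from individual characteristics and the pre-sample
%         occupational composition of the region of residence
% a_o:    automatability score of occupation o
% \Delta R_{ct}: pace of robot adoption in country c at election year t
v_i = \sum_{o} p_{io}\, a_o ,
\qquad
\text{Exposure}_{ict} = v_i \times \Delta R_{ct}
```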
Methodological approach. As a conceptual framework, we use an extended version of a state-of-the-art property rights model of sequential value chains that allows for intangible assets and intellectual property rights (IPR) protection. We then test our extended model's predictions through probit regressions on Slovenian firm-level and transaction-level data.
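Purely as an illustration of the estimation step, a probit specification of this kind might be set up as sketched below; the data-generating process, variable names and coefficients are hypothetical placeholders, not the actual Slovenian firm- and transaction-level variables or the paper’s specification:

```python
# Minimal probit sketch on simulated data (hypothetical variables throughout).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Hypothetical transaction-level covariates suggested by the model's themes.
intangible_intensity = rng.uniform(0.0, 1.0, n)   # share of intangible inputs
ipr_protection = rng.integers(0, 2, n)            # 1 if IPR protection applies

# Simulated probit data-generating process for a binary organizational choice.
latent = -0.5 + 1.2 * intangible_intensity + 0.8 * ipr_protection + rng.normal(size=n)
outcome = (latent > 0).astype(int)

X = sm.add_constant(pd.DataFrame({
    "intangible_intensity": intangible_intensity,
    "ipr_protection": ipr_protection,
}))
probit_res = sm.Probit(outcome, X).fit(disp=False)
print(probit_res.summary())
```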