34 results for "Henri Fraisse"
Search Results
2. A Domain-Specific Architecture for Accelerating Sparse Matrix Vector Multiplication on FPGAs.
- Author
-
Abhishek Kumar Jain, Hossein Omidian, Henri Fraisse, Mansimran Benipal, Lisa Liu, and Dinesh Gaitonde
- Published
- 2020
- Full Text
- View/download PDF
3. The Real Effects of Bank Capital Requirements.
- Author
-
Henri Fraisse, Mathias Lé, and David Thesmar
- Published
- 2020
- Full Text
- View/download PDF
4. SAT Based Place-And-Route for High-Speed Designs on 2.5D FPGAs.
- Author
-
Chirag Ravishankar, Henri Fraisse, and Dinesh Gaitonde
- Published
- 2018
- Full Text
- View/download PDF
5. A SAT-based Timing Driven Place and Route Flow for Critical Soft IP.
- Author
-
Henri Fraisse and Dinesh Gaitonde
- Published
- 2018
- Full Text
- View/download PDF
6. Automated extra pipeline analysis of applications mapped to Xilinx UltraScale+ FPGAs.
- Author
-
Ilya Ganusov, Henri Fraisse, Aaron Ng, Rafael Trapani Possignolo, and Sabya Das
- Published
- 2016
- Full Text
- View/download PDF
7. Boolean Satisfiability-Based Routing and Its Application to Xilinx UltraScale Clock Network.
- Author
-
Henri Fraisse, Abhishek Joshi, Dinesh Gaitonde, and Alireza Kaviani
- Published
- 2016
- Full Text
- View/download PDF
8. Analyzing the divide between FPGA academic and commercial results.
- Author
-
Elias Vansteenkiste, Alireza Kaviani, and Henri Fraisse
- Published
- 2015
- Full Text
- View/download PDF
9. A New Viewpoint on Two-Level Logic Minimization.
- Author
-
Olivier Coudert, Jean Christophe Madre, and Henri Fraisse
- Published
- 1993
- Full Text
- View/download PDF
10. Return on Investment on AI: The Case of Capital Requirement
- Author
-
Henri Fraisse and Matthias Laporte
- Subjects
Incentive ,Computer science ,business.industry ,Deep learning ,Return on investment ,Internal model ,Econometrics ,Capital requirement ,Default ,Gradient boosting ,Artificial intelligence ,business ,Credit risk - Abstract
Taking advantage of granular data, we measure the change in bank capital requirement resulting from the implementation of AI techniques to predict corporate defaults. For each of the largest banks operating in France, we design an algorithm to build pseudo-internal models of credit risk management for a range of methodologies extensively used in AI (random forest, gradient boosting, ridge regression, deep learning). We compare these models to the traditional model usually in place, which essentially relies on a combination of logistic regression and expert judgement. The comparison is made along two sets of criteria: (i) the ability to pass compliance tests used by regulators during on-site model validation missions, and (ii) the induced changes in capital requirement. The different models show noticeable differences in their ability to pass the regulatory tests and to lead to a reduction in capital requirement. While displaying an ability to pass compliance tests similar to that of the traditional model, neural networks provide the strongest incentive for banks to apply AI models in their internal credit risk models for corporate businesses, as they lead in some cases to sizeable reductions in capital requirement.
- Published
- 2021
- Full Text
- View/download PDF
11. A Domain-Specific Architecture for Accelerating Sparse Matrix Vector Multiplication on FPGAs
- Author
-
Lisa Liu, Henri Fraisse, Mansimran Benipal, Hossein Omidian, Abhishek Kumar Jain, and Dinesh D. Gaitonde
- Subjects
010302 applied physics ,Modularity (networks) ,Memory hierarchy ,Plug and play ,Computer science ,Sparse matrix-vector multiplication ,02 engineering and technology ,Parallel computing ,01 natural sciences ,020202 computer hardware & architecture ,0103 physical sciences ,0202 electrical engineering, electronic engineering, information engineering ,Routing (electronic design automation) ,Field-programmable gate array ,Block (data storage) ,Efficient energy use - Abstract
FPGAs allow custom memory hierarchies and flexible data movement with highly fine-grained control. These capabilities are critical for building high-performance and energy-efficient domain-specific architectures (DSAs), especially for workloads with irregular memory access and data-dependent communication patterns. Sparse linear algebra operations, especially sparse matrix vector multiplication (SpMV), are examples of such workloads and are becoming important due to their use in numerous areas of science and engineering. Existing FPGA-based DSAs for SpMV do not allow customization through plug and play of the building blocks. For example, most of these DSAs require a switching network/crossbar architecture as a building block for routing matrix data to banked vector memory blocks. In this paper, we first present an approach where a custom network is built from simple blocks arranged in a regular fashion to exploit low-level architecture details. Further, we use this network to replace the expensive crossbars employed in the GEMX SpMV engine and develop an end-to-end tool-flow around a mixed-IP approach (HLS/RTL). Due to the modularity of our design, our tool-flow allows us to insert an additional block in the design to guarantee zero stalls from the accumulation stage. On Alveo U200, we report performance numbers of up to 4.4 GFLOPS (92% peak bandwidth utilization) using our accelerator (attached to one DDR4).
- Published
- 2020
- Full Text
- View/download PDF
12. Return on investment on artificial intelligence: The case of bank capital requirement
- Author
-
Henri Fraisse and Matthias Laporte
- Subjects
Economics and Econometrics ,Finance - Published
- 2022
- Full Text
- View/download PDF
13. Can the Provision of Long-Term Liquidity Help to Avoid a Credit Crunch? Evidence from the Eurosystem’s LTRO
- Author
-
Philippe Andrade, Christophe Cahn, Jean-Stéphane Mésonnier, and Henri Fraisse
- Subjects
biology ,05 social sciences ,Control (management) ,Euros ,Financial system ,biology.organism_classification ,Market liquidity ,Loan ,Capital (economics) ,0502 economics and business ,Financial crisis ,Credit crunch ,Business ,050207 economics ,Baseline (configuration management) ,General Economics, Econometrics and Finance ,050205 econometrics - Abstract
We exploit the Eurosystem’s longer-term refinancing operations (LTROs) of 2011–2012 to assess whether a large provision of central bank liquidity to banks during a financial crisis has a positive impact on banks’ credit supply to firms. We control for credit demand by examining firms that borrow from several banks, in addition to controlling for confounding factors at the level of banks. We find that the LTROs enhanced loan supply: according to our baseline estimate, banks borrowing 1 billion euros from the facility increased their loan supply by 186 million euros over one year. We also find that the transmission mostly took place with the first operation of December 2011, in which banks that were more capital constrained bid more. Moreover, we show that the opportunity to substitute long-term central bank liquidity for short-term liquidity enhanced this transmission. Lastly, the operations benefited larger borrowers more and did not lead banks to increase their lending to riskier firms.
- Published
- 2018
- Full Text
- View/download PDF
14. Households Debt Restructuring: The Re-default Effects of a Debt Suspension
- Author
-
Henri Fraisse
- Subjects
Finance ,Organizational Behavior and Human Resource Management ,Economics and Econometrics ,050208 finance ,Restructuring ,business.industry ,05 social sciences ,Debt-to-GDP ratio ,Recourse debt ,Monetary economics ,External debt ,Debt restructuring ,0502 economics and business ,Debt ratio ,Business ,Internal debt ,050207 economics ,Debt levels and flows ,Law - Abstract
When facing financial distress, French households can file a case with a “households’ over-indebtedness commission” (HDC). The HDC can order an immediate repayment or grant a debt suspension. Exploiting the random assignment of bankruptcy filings to managers, we show that a debt suspension has a significant negative effect on the likelihood of re-default, but that this impact is short-lived. The effect depends not only on the characteristics of the households but also on the nature of their indebtedness. Our results imply that, rather than focusing on a specific debt profile, a deeper restructuring of the expenditure side is above all necessary to make the plan sustainable in the case of a uniform increase in HDC severity. They also single out specific banks lending to particularly fragile households. Finally, they indicate the importance of policy actions on budget counseling, as well as of regulation of credit distribution, to avoid both entering into bankruptcy and re-filing for bankruptcy.
- Published
- 2017
- Full Text
- View/download PDF
15. Lower Bank Capital Requirements as a Policy Tool to Support Credit to SMEs: Evidence From a Policy Experiment?
- Author
-
Henri Fraisse, Mathias Lé, Michel Dietsch, and Sandrine Lecarpentier
- Subjects
Bank capital ,business.industry ,Economic capital ,Capital requirement ,Distribution (economics) ,Financial system ,European commission ,Business ,Directive ,Basel III ,Credit risk - Abstract
Starting in 2014, with the implementation of the European Commission Capital Requirement Directive, banks operating in the euro area benefited from a 25% reduction (the Supporting Factor, or "SF" hereafter) in their own funds requirements against loans to small and medium-sized enterprises ("SMEs" hereafter). We investigate empirically whether this reduction has supported SME financing and to what extent it is consistent with SME credit risk. Economic capital computations based on multifactor models confirm that capital requirements should be lower for SMEs. Taking into account the uncertainty surrounding their estimates and adopting a conservative approach, we show that the SF is consistent with the difference in economic capital between SMEs and large corporates. As for the impact on credit distribution, our difference-in-differences specification finds a positive and significant impact of the SF on credit supply.
- Published
- 2020
- Full Text
- View/download PDF
16. SAT Based Place-And-Route for High-Speed Designs on 2.5D FPGAs
- Author
-
Dinesh D. Gaitonde, Chirag Ravishankar, and Henri Fraisse
- Subjects
business.industry ,Computer science ,Interface (computing) ,020208 electrical & electronic engineering ,Spec# ,02 engineering and technology ,Clock skew ,020202 computer hardware & architecture ,Embedded system ,Scalability ,0202 electrical engineering, electronic engineering, information engineering ,Place and route ,Routing (electronic design automation) ,Field-programmable gate array ,business ,computer ,Communication channel ,computer.programming_language - Abstract
2.5D stacking technology allows us to build high-performance, high-capacity FPGA devices at reasonable cost. Communication between multiple dies happens over a passive silicon interposer at high speed, which poses several interesting challenges. Due to clock skew characteristics across multiple dies and the increased min-max spread of delays, place-and-route tools need to address inter-die hold violations while optimizing for performance. We implement a tractable SAT-based methodology to achieve this by minimally detouring data paths to meet all hold requirements while optimizing performance. We also confine the solution to a small window around each inter-die (Laguna) channel to reduce routing resource utilization and congestion, and to scale the methodology to any Laguna channel utilization. We improve performance across the interface by 11% compared to a state-of-the-art commercial flow and meet a 500 MHz spec on Xilinx(R) UltraScale+(TM) devices in the 2E speed grade. We address the scalability concerns of SAT and show how our approach can be used in practice with negligible runtime in implementation tools. Our solution paves the way for FPGA-as-a-service platforms, where fast inter-die communication that does not interfere with user-specific logic is pivotal to success.
- Published
- 2018
- Full Text
- View/download PDF
17. A SAT-based Timing Driven Place and Route Flow for Critical Soft IP
- Author
-
Dinesh D. Gaitonde and Henri Fraisse
- Subjects
Emulation ,Computer science ,business.industry ,Soft IP ,02 engineering and technology ,Timing closure ,020202 computer hardware & architecture ,Embedded system ,Scalability ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Place and route ,Boolean satisfiability problem ,Field-programmable gate array ,business ,Hardware_LOGICDESIGN ,PCI Express - Abstract
Many FPGA designs contain soft IP tightly connected to hard blocks such as the on-chip processor, PCIe, or I/Os. Generally, these soft IPs pose significant timing closure challenges. In this paper, we propose a timing-driven place-and-route flow based on Boolean satisfiability (SAT). Its main advantages over previous SAT-based approaches are its improved scalability and its timing awareness. We validate our flow using an IP targeting the emulation market and demonstrate that it can significantly improve the usable bandwidth of FPGA I/Os. Since the proposed flow is SAT-based, its performance does not depend on the specific ways in which more traditional place-and-route flows are usually tuned.
- Published
- 2018
- Full Text
- View/download PDF
18. Sentiment de sécurité de l'emploi : l'effet des indemnités chômage et de la justice prud'homale
- Author
-
Henri Fraisse, Corinne Prost, and Laurence Rioux
- Subjects
Business and International Management ,General Economics, Econometrics and Finance - Published
- 2015
- Full Text
- View/download PDF
19. Automated extra pipeline analysis of applications mapped to Xilinx UltraScale+ FPGAs
- Author
-
Henri Fraisse, Ilya K. Ganusov, Rafael Trapani Possignolo, Ng Aaron, and Sabyasachi Das
- Subjects
Computer science ,business.industry ,Pipeline (computing) ,Overhead (engineering) ,0211 other engineering and technologies ,02 engineering and technology ,020202 computer hardware & architecture ,Set (abstract data type) ,Computer architecture ,Embedded system ,0202 electrical engineering, electronic engineering, information engineering ,Benchmark (computing) ,Algorithm design ,Hardware_ARITHMETICANDLOGICSTRUCTURES ,Performance improvement ,Heuristics ,business ,Field-programmable gate array ,Hardware_REGISTER-TRANSFER-LEVELIMPLEMENTATION ,021106 design practice & management - Abstract
This paper describes the methodology and algorithms behind the extra pipeline analysis tools released in Xilinx Vivado Design Suite version 2015.3. Extra pipelining is one of the most effective ways to improve the performance of FPGA applications. Manual pipelining, however, often requires significant effort from FPGA designers, who need to explore various changes in the RTL and re-run the flow iteratively. The automatic pipelining approach described in this paper, in contrast, allows FPGA users to explore latency vs. performance trade-offs of their designs before investing time and effort into modifying RTL. We describe the algorithms behind these tools, which use simple cut heuristics to maximize performance improvement while minimizing additional latency and register overhead. To demonstrate the effectiveness of the proposed approach, we analyse a set of 93 commercial FPGA applications and IP blocks mapped to the Xilinx UltraScale+ and UltraScale generations of FPGAs. The results show that extra pipelining can provide 18% to 29% potential Fmax improvement on average. They also show that the distribution of improvements is bimodal, with almost half of the benchmark designs showing no improvement due to the presence of large loops. Finally, we demonstrate that highly pipelined designs map well to the UltraScale+ and UltraScale FPGA architectures. Our approach demonstrates 19% and 20% Fmax improvement potential for the UltraScale+ and UltraScale architectures respectively, with the majority of applications reaching their loop limit through pipelining.
- Published
- 2016
- Full Text
- View/download PDF
20. Les commissions de surendettement des ménages : de l’objectif de négociation à la prévention de la rechute
- Author
-
Anne Muller and Henri Fraisse
- Subjects
Statistics and Probability ,Economics and Econometrics ,Sociology and Political Science - Abstract
The 1990 Neiertz Act established “overindebtedness commissions” in charge of collective proceedings for restructuring the debt of French households unable to meet their obligations. The initial goal was to negotiate amicable settlements between households and their creditors. To cope with the growing number of households concerned, lawmakers tasked the commissions, implicitly or not, with reducing the volume of initial and “repeat” applications for help. We assess the determinants of the outcomes of applications reviewed between 2007 and 2009: inadmissibility, amicable settlement, or court-ordered solutions. We then seek the factors responsible for relapses into overindebtedness among households assisted by the commissions in 2007. Low income, high current expenses, and heavy debt are detrimental to negotiation. Coordination issues also play a role: an abundance of creditors and high debt dispersion make a settlement less likely. The outcome is also influenced by local economic conditions, the strictness of the commissions, and the identity of the creditors. However, the commissions do manage to arrive at negotiated solutions for overindebted households with the most precarious employment statuses. Over a two-year period, approximately one in four overindebted households that have been ordered to repay part of their debt relapse. The main explanatory factor is the household’s initial situation, i.e., at the time it files its case with the commission. According to our estimates, had they instead been given repayment plans, overindebted households granted a moratorium would have relapsed about one time in three, and those declared inadmissible slightly more than one time in ten. We conclude that the commissions reject applications from households that would have been unlikely to relapse. By contrast, they are more lenient towards households at greatest risk, recommending debt forgiveness or a moratorium. Fraisse Henri, Muller Anne. Les commissions de surendettement des ménages : de l’objectif de négociation à la prévention de la rechute. In: Economie et statistique, n°443, 2011. pp. 3-27.
- Published
- 2011
- Full Text
- View/download PDF
21. Support for the SME Supporting Factor: Multi-Country Empirical Evidence on Systematic Risk Factor for SME Loans
- Author
-
Michel Dietsch, Klaus Düllmann, Henri Fraisse, Philipp Koziol, and Christine Ott
- Published
- 2016
- Full Text
- View/download PDF
22. The Competitive Effect of a Bank Megamerger on Credit Supply
- Author
-
Henri Fraisse, Johan Hombert, Mathias Lé, Haldemann, Antoine, Groupement de Recherche et d'Etudes en Gestion à HEC (GREGH), Ecole des Hautes Etudes Commerciales (HEC Paris)-Centre National de la Recherche Scientifique (CNRS), and HEC Research Paper Series
- Subjects
Economics and Econometrics ,Exploit ,JEL: L - Industrial Organization/L.L1 - Market Structure, Firm Strategy, and Market Performance/L.L1.L13 - Oligopoly and Other Imperfect Markets ,Financial system ,Banking competition ,Monetary economics ,Bank megamerger ,Competition (economics) ,Bank credit ,JEL: G - Financial Economics/G.G2 - Financial Institutions and Services/G.G2.G21 - Banks • Depository Institutions • Micro Finance Institutions • Mortgages ,Credit history ,0502 economics and business ,040101 forestry ,050208 finance ,Merger ,05 social sciences ,04 agricultural and veterinary sciences ,Start up ,Investment (macroeconomics) ,Credit Supply ,0401 agriculture, forestry, and fisheries ,Bond market ,[SHS.GESTION]Humanities and Social Sciences/Business administration ,Credit crunch ,Business ,Credit valuation adjustment ,[SHS.GESTION] Humanities and Social Sciences/Business administration ,Finance - Abstract
We examine how the merger between two European megabanks affects credit supply to small and medium-sized businesses. Using loan-level and firm-level data, we exploit variation in the merging banks' market overlap to identify the competition effect of the merger. We find that the merged bank decreases the supply of credit to existing firms and new firms. This effect is not offset by other banks increasing their lending, leading to an overall decline in bank credit. This reduction in credit supply is associated with higher firm exit. However, for continuing firms, the merger has no adverse effects on investment and employment.
- Published
- 2016
23. Can the Provision of Long-Term Liquidity Help to Avoid a Credit Crunch? Evidence from the Eurosystem's LTROs
- Author
-
Philippe Andrade, Jean-Stéphane Mésonnier, Henri Fraisse, and Christophe Cahn
- Subjects
Exploit ,jel:E51 ,Control (management) ,Financial system ,jel:C21 ,jel:G21 ,Market liquidity ,Term (time) ,jel:G28 ,Central bank ,Loan ,unconventional monetary policy, bank lending channel, euro area, LTRO, credit supply ,Credit crunch ,Business - Abstract
We exploit the Eurosystem’s longer-term refinancing operations (LTROs) of 2011-2012 to analyze the effects that a large provision of central bank liquidity to banks has on the credit supply to firms. We control for credit demand by examining firms that borrow from several banks, in addition to controlling for banks’ risk. We find that LTROs enhanced loan supply in France. Nevertheless, the transmission took place mostly with the first operation of December 2011, in which constrained banks bid more, and larger borrowers benefited more. The opportunity to substitute long-term central bank borrowing for short-term borrowing was instrumental in this transmission.
- Published
- 2015
- Full Text
- View/download PDF
24. Labor Disputes and Job Flows
- Author
-
Corinne Prost, Henri Fraisse, and Francis Kramarz
- Subjects
Organizational Behavior and Human Resource Management ,Labour economics ,Unfair dismissal ,Employment protection legislation ,Strategy and Management ,employment protection legislation, job flows, labor judges, unfair dismissal, France ,jel:J53 ,jel:J32 ,jel:K31 ,jel:J63 ,Labor relations ,Management of Technology and Innovation ,Economics ,Labor disputes - Abstract
This article uses variation in the local activity of the labor courts to assess the effect of dismissal costs on the labor market. Judicial activity is analyzed using a data set of individual labor disputes brought to French courts over the years 1996 to 2003. Several indicators are computed: the percentage of dismissed workers who litigate in employment tribunals, and the fraction of cases leading to a conciliation between the parties, going to trial, or resulting in a worker victory. First, we present a simple theoretical framework that helps us understand the links between litigation costs, judicial outcomes, and firing costs. Court outcomes are endogenous not only to market conditions but also to litigation costs: a high filing rate can stem from low litigation costs for workers, leading to high dismissal costs for firms; it may equally stem from low litigation costs for firms, with employers taking more risks when firing workers. Second, we regress job flows on indicators of judicial outcomes, using an instrument based on local shocks in the supply of lawyers. We find that when the number of lawyers increases, workers litigate more often, which should increase firing costs for firms. This increased filing rate causes a decrease in employment fluctuations, especially for shrinking or exiting firms. The effect on employment growth is positive in the short term.
- Published
- 2014
25. The Real Effects of Bank Capital Requirements
- Author
-
David Thesmar, Mathias Lé, and Henri Fraisse
- Subjects
Finance ,Capital adequacy ratio ,Physical capital ,Financial capital ,business.industry ,Economic capital ,Risk-adjusted return on capital ,Capital employed ,Capital requirement ,Capital intensity ,Monetary economics ,business - Abstract
We measure the impact of bank capital requirements on corporate borrowing and investment using loan-level data. The Basel II regulatory framework makes capital requirements vary both across banks and across firms, which allows us to control for firm-level credit demand shocks and bank-level credit supply shocks. We find that a 1 percentage point increase in capital requirements reduces lending by 10%. Firms can attenuate this reduction by substituting borrowing across banks, but only partially. The resulting reduction in borrowing capacity impacts investment, but not working capital: fixed assets are reduced by 2.6%, but lending to customers is unaffected.
- Published
- 2013
- Full Text
- View/download PDF
26. Sentiment de sécurité de l’emploi : l’effet des indemnités chômage et de la justice prud’homale
- Author
-
Laurence Rioux, Henri Fraisse, and Corinne Prost
- Subjects
Business and International Management ,JEL classifications J28 - J65 - J32 - J53 - J63 - K31 ,France ,labour justice ,perceived job security ,unemployment insurance ,employment protection legislation ,General Economics, Econometrics and Finance ,sentiment de sécurité de l’emploi ,législation sur la protection de l’emploi ,assurance chômage ,Classifications JEL J28 - J65 - J32 - J53 - J63 - K31 ,prud’hommes - Abstract
Perceived Job Security: The Effects of Unemployment Insurance and Labour Court Activity. We analyse the effects of unemployment insurance and employment protection legislation on workers’ perception of job security. The French sample of the European Community Household Panel provides an indicator of workers’ perceived job security and enables us to simulate the potential unemployment insurance benefits an employee would receive in case of job loss. Exploiting a data set of unfair dismissal cases brought to labour courts, we compute the workers’ mean victory rate and the mean time before a case is heard across all French départements. We find that an increase in maximum compensation duration has a strong positive impact on perceived job security. A higher worker victory rate and speedier labour court processing are also found to significantly improve perceived job security. Fraisse Henri, Prost Corinne, Rioux Laurence. Sentiment de sécurité de l’emploi : l’effet des indemnités chômage et de la justice prud’homale. In: Économie & prévision, n°202-203, 2013. Economie du droit. pp. 101-120.
- Published
- 2013
27. Euro Area Labour Markets and the Crisis
- Author
-
Robert Anderton, Mario Izquierdo, Ted Aranki, Boele Bonthuis, Katarzyna Barbara Budnik, Ramon Gomez Salvador, Valerie Jarvis, Ana Lamo, Aidan Meyler, Daphne Momferatou, Roberta Serafini, Magdalena Spooner, Martine Druant, Jan De Mulder, Katja Sonderhof, Daniel Radowski, Orsolya Soosaar, Natalja Viilmann, Suzanne Linehan, Daphne Nicolitsas, Sergio Puente, Cristina Fernandez, Gregory Verdugo, Matteo Mogliani, Henri Fraisse, Roberta Zizza, Michalis Ktoris, Cindy Veiga Nunes, Muriel Bouchet, Sandra Zerafa, Ian Sapiano, Marco M. Hoeberichts, Jante Parlevliet, Alfred Stiglbauer, Paul Ramskogler, Jose Maria, Claudia Duarte, Manca Jesenko, Helena Solcanska, Pavel Gertler, Juuso Vanhala, and Heidi Schauman
- Published
- 2012
- Full Text
- View/download PDF
28. Changes in Wage Inequality in France: The Impact of Composition Effects (in French)
- Author
-
Gregory Verdugo, Henri Fraisse, and Guillaume Horny
- Subjects
Decile ,Wage inequality ,Wage Inequality, France ,Labour economics ,jel:J3 ,Inequality ,media_common.quotation_subject ,Efficiency wage ,Economics ,Wage ,jel:D3 ,media_common - Abstract
This paper investigates the recent changes in the French wage structure from 1990 to 2008. To do so, we disentangle the impact of changes in employment probability, changes in the levels of education and experience, and changes in the price of labor. Unlike in other developed countries, we find that upper- and lower-tail inequality declined between the first and the last decile for both males and females. The recent period could thus be described as a period of "great compression" of wages between the first and the last decile. As a result, the decline in the returns to education and experience has produced one of the most egalitarian wage structures ever observed in France since the 1960s.
- Published
- 2012
- Full Text
- View/download PDF
29. Évolution des inégalités salariales en France: Le rôle des effets de composition
- Author
-
Guillaume Horny, Henri Fraisse, Gregory Verdugo, Centre d'économie de la Sorbonne (CES), Université Paris 1 Panthéon-Sorbonne (UP1)-Centre National de la Recherche Scientifique (CNRS), Observatoire français des conjonctures économiques (Sciences Po) (OFCE), Sciences Po (Sciences Po), Centre de recherche de la Banque de France, Banque de France, and Observatoire français des conjonctures économiques (OFCE)
- Subjects
jel:J3 ,jel:D3 ,wage inequality (inégalités salariales) ,France ,General Economics, Econometrics and Finance ,[SHS]Humanities and Social Sciences - Abstract
This article studies the evolution of the French wage distribution from 1990 to 2008. We separate the effect of changes in employment probability and in qualifications from the effect of changes in the price of labor. Unlike in other developed countries, inequality in both the lower and the upper parts of the distribution, between the first and the last decile, declined for both men and women. The recent period can thus be described as one of "great compression" of wages between the first and the last decile. The fall in the returns to education and experience over the period produced a wage structure in 2008 that is among the most egalitarian ever observed in France since the 1960s.
- Published
- 2012
- Full Text
- View/download PDF
30. Labor Disputes and Labor Flows
- Author
-
Henri Fraisse, Francis Kramarz, and Corinne Prost
- Subjects
labor judges, labor flows, employment protection legislation, unfair dismissal, France ,jel:J53 ,jel:J32 ,jel:K31 ,jel:J63 - Abstract
About one in four workers challenges her dismissal in front of a labor court in France. Using a data set of individual labor disputes brought to French courts over the years 1996 to 2003, we examine the impact of labor court activity on labor market flows. First, we present a simple theoretical model showing the links between judicial costs and judicial case outcomes. Second, we exploit our model as well as the French institutional setting to generate instruments for these endogenous outcomes. In particular, we use shocks in the supply of lawyers who resettle close to their university of origin. Using these instruments, we show that labor court decisions have a causal effect on labor flows. More trials and more cases won by the workers cause more job destructions. More settlements, higher filing rates, and a larger fraction of workers represented by a lawyer dampen job destructions. Various robustness checks confirm these findings.
- Published
- 2011
31. Labour Disputes and the Game of Legal Representation
- Author
-
Henri Fraisse
- Subjects
litigation, lawyers, labour dispute resolution, prisoner’s dilemma ,jel:J53 ,jel:K41 ,jel:J52 - Abstract
This paper explores the prisoner’s dilemma that may result when workers and firms are involved in labour disputes and must decide whether to hire a lawyer to be represented at trial. Using a representative data set of labour disputes in the UK and a large population of French unfair dismissal cases, we find that a lawyer substantially increases the firm’s probability of winning at trial but has little effect on the worker’s victory probability. The UK data contain award and litigation costs and allow us to compute the pay-off matrix. We do not find evidence of a prisoner’s dilemma, given that the total pay-off for the worker is not significantly different whether she is represented or not. Surprisingly, the dominant strategy for the firm is not to be represented.
- Published
- 2010
32. Labor Court Inputs, Judicial Cases Outcomes and Labor Flows: Identifying Real EPL
- Author
-
Corinne Prost, Henri Fraisse, and Francis Kramarz
- Subjects
Labour economics ,Unfair dismissal ,Employment protection legislation ,jel:J53 ,jel:J32 ,jel:K31 ,jel:J63 ,Labor relations ,Economics ,Labor disputes ,Enforcement ,Employment protection legislation, Labor flows, Labor judges, Unfair dismissal, France - Abstract
Using a data set of individual labor disputes brought to court over the years 1990 to 2003 in France, we examine the impact of the enforcement of Employment Protection Legislation on labor market outcomes. First, we present a simple theoretical model showing that judicial case outcomes cannot be directly interpreted in terms of EPL. A large fraction of cases going to trial may well be a sign of low firing costs, when firms face low litigation costs and are therefore willing to go to court, or a sign of high firing costs, when workers face low litigation costs and are therefore willing to sue the firm. Second, we exploit our model as well as the French institutional setting to generate instruments for these endogenous outcomes. Using these instruments, we show that labor court decisions have a causal effect on labor flows. More dropped cases and more trials cause more job destructions: more trials are indeed a sign of lower separation costs. More settlements, higher filing rates, a larger fraction of workers represented at trial, and a larger lawyer density dampen job destruction. A larger judge density causes less job creation, in particular on the extensive margin.
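The instrument-based identification described in this abstract follows the standard two-stage least squares logic. As a generic sketch of that estimator (not the authors' code; the data here are simulated and all variable names are hypothetical), instrumenting an endogenous regressor removes the bias that plain OLS would inherit from an unobserved confounder:

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """Minimal 2SLS estimator.

    X holds the (possibly endogenous) regressors and Z the instruments;
    both should include a constant column. Stage 1 projects X onto the
    column space of Z; stage 2 runs OLS of y on the fitted values.
    """
    X_hat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)         # stage 1 fitted values
    beta = np.linalg.solve(X_hat.T @ X_hat, X_hat.T @ y)  # stage 2 OLS
    return beta

# Simulated example: x is endogenous (correlated with the error through
# the confounder u); z is a valid instrument (shifts x, independent of u).
rng = np.random.default_rng(0)
n = 20_000
z = rng.normal(size=n)
u = rng.normal(size=n)                       # unobserved confounder
x = z + u + 0.1 * rng.normal(size=n)
y = 2.0 * x + u + 0.1 * rng.normal(size=n)   # true causal effect of x is 2
Z = np.column_stack([np.ones(n), z])
X = np.column_stack([np.ones(n), x])
beta = two_stage_least_squares(y, X, Z)
```

On this simulated data, OLS of y on x would overstate the effect because x and the error share the confounder u, while the 2SLS slope `beta[1]` recovers the true coefficient of 2 up to sampling noise.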
- Published
- 2009
33. Model for Analysing and Forecasting Short Term Developments
- Author
-
Olivier de Bandt, Henri Fraisse, Jean-Pierre Villetelle, Mustapha Baghli, and Véronique Brunhes-Lesage
- Subjects
Rebasing ,Economy ,media_common.quotation_subject ,National accounts ,Economics ,Econometrics ,Wage ,Accounting framework ,Relative price ,Imperfect competition ,Interest rate ,media_common - Abstract
MASCOTTE is the new version of the Banque de France's macro-econometric forecasting model. Following the last rebasing of the National Accounts (currently at 1995 prices), the previous version of the model was simplified, re-specified and re-estimated. The model is essentially used for making macro-economic projections of the French economy over a two-to-three-year horizon, which requires an accounting framework as close as possible to the French National Accounts. The main agents are companies, households, general government and the rest of the world. The new version now includes a supply block derived from the explicit optimisation behaviour of companies using a Cobb-Douglas technology under imperfect competition, and a new Wage Setting schedule. Full homogeneity of the nominal side of the model ensures the independence of the nominal equilibrium from the real equilibrium, the latter being determined in the long run only by relative prices. Furthermore, as regards the specification of equations, special attention was paid to the consequences of changes in short-term interest rates.
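The supply block described above rests on a Cobb-Douglas technology. A minimal sketch of that production function and the constant labour share it implies (parameter values are illustrative, not those estimated in MASCOTTE):

```python
import numpy as np

def cobb_douglas(K, L, A=1.0, alpha=0.3):
    """Cobb-Douglas production: Y = A * K**alpha * L**(1 - alpha)."""
    return A * K**alpha * L**(1 - alpha)

# The elasticity of output with respect to labour is 1 - alpha, so under
# perfect competition the labour share of income is constant at 1 - alpha.
# (Under imperfect competition, as in the model above, a markup scales
# this share down, but it remains constant.)
K, L = 100.0, 50.0
Y = cobb_douglas(K, L)
h = 1e-6
mpl = (cobb_douglas(K, L + h) - Y) / h   # numerical marginal product of labour
labour_share = mpl * L / Y               # equals 1 - alpha = 0.7
```

The constant factor shares are what make the functional form convenient for a supply block: the long-run wage and price equations can be written against a stable labour-share target.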
- Published
- 2004
- Full Text
- View/download PDF
34. Potential Output and Output Gap: Some Estimates for France (in French)
- Author
-
Olivier de Bandt, Henri Fraisse, Mustapha Baghli, philippe Rousseaux, Carine Bouthevillain, and Hervé Le Bihan
- Subjects
Output gap ,Statistics ,Hodrick–Prescott filter ,Business sector ,Econometrics ,Univariate ,NAIRU ,Economics ,Function (mathematics) ,Potential output ,Smoothing - Abstract
This Study and Research Paper is devoted to different estimates of the French economy's potential output and output gap. Several methods, presented in detail, are put forward to measure these indicators. The first two sections of the paper present univariate statistical approaches: smoothing with the Hodrick-Prescott filter, and estimation of a trend, possibly including breaks. The next two sections extend the discussion of statistical techniques to multivariate cases, namely structural VAR models and unobserved-component models. The final section proposes a structural method for estimating potential output, in which business-sector output is described by a Cobb-Douglas function while non-business-sector output is assumed to be exogenous. For this structural method, the NAIRU has to be calculated before the short- to medium-term level of potential output can be estimated.
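The Hodrick-Prescott smoothing mentioned above can be sketched directly from its penalised least-squares definition; this is an illustrative implementation (not the paper's code), with lam = 1600, the conventional value for quarterly data:

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott filter: split a series into trend and cycle.

    The trend tau minimises sum((y - tau)**2) + lam * sum(diff(tau, 2)**2),
    whose closed-form solution is tau = solve(I + lam * K'K, y), where K
    is the second-difference operator.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    K = np.zeros((n - 2, n))          # second-difference matrix
    for i in range(n - 2):
        K[i, i], K[i, i + 1], K[i, i + 2] = 1.0, -2.0, 1.0
    trend = np.linalg.solve(np.eye(n) + lam * K.T @ K, y)
    return trend, y - trend           # trend and cyclical component
```

A handy sanity check: K annihilates linear series, so a pure linear trend passes through the filter unchanged and yields a zero cycle.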
- Published
- 2002
- Full Text
- View/download PDF