505 results on '"Term (time)"'
Search Results
2. What We Talk About When We Talk About Compliance
- Author
-
Gaetano Presti
- Subjects
Incentive ,Relation (database) ,media_common.quotation_subject ,Political science ,Ambiguity ,Affect (linguistics) ,Business management ,Term (time) ,Compliance (psychology) ,Law and economics ,media_common - Abstract
This brief paper, deliberately slim in both size and bibliography, has a modest purpose. First, it aims to show the existence of several forms of compliance, characterized by different geneses and purposes, and thereby to highlight the ambiguity of a term that is not always used in a univocal sense, due in part to its links with other currently much-debated issues. It further argues that, despite the different forms and meanings compliance can assume, it is possible to identify a common unifying thread. On the basis of a positive assessment of the development of compliance, especially in relation to the ever-increasing importance of the legal and reputational risks associated with economic activity, the paper ends with a brief overview of the legal instruments available to support this trend or, more precisely, to ensure that the proclaimed importance of compliance is accompanied by effective implementation capable of positively affecting business management.
- Published
- 2021
3. The Dynamics of the Main Fertility Elements Content in a Long-Term Agrochemical Stationary Experiment
- Author
-
Elena Kushaeva, Roman Timoshinov, Alexey Klykov, Lyudmila Marchuk, and Alexandr Dubkov
- Subjects
Agrochemical ,business.industry ,media_common.quotation_subject ,Environmental science ,Fertility ,Agricultural engineering ,business ,Term (time) ,media_common - Published
- 2021
4. Intelligent Construction for Infrastructure—The Framework
- Author
-
George K. Chang, António Gomes Correia, Guanghui Xu, and Soheil Nazarian
- Subjects
Process (engineering) ,Status quo ,business.industry ,Computer science ,media_common.quotation_subject ,Big data ,Information technology ,computer.software_genre ,Expert system ,Term (time) ,Engineering management ,Paradigm shift ,business ,computer ,Transportation infrastructure ,media_common - Abstract
Due to the impact of modern information technology, the construction of transportation infrastructure has entered the “Intelligent Era,” and conventional construction technology is undergoing a paradigm shift. In this paper, the term “intelligent construction” is defined, and the status quo of global transportation infrastructure construction with emerging intelligent construction technologies is summarized. The framework of intelligent construction is presented across all aspects of the integration of modern information technology with conventional highway technologies. The essential elements of intelligent construction are sensing, analysis, decision-making, and execution. Big data, machine learning, and expert systems are applied to provide practical technical solutions for intelligent construction. The paper also elaborates on the process of intelligent decision-making and auto-feedback machine controls. Finally, the future development of intelligent construction is laid out. Globally coordinated efforts by the International Society for Intelligent Construction (ISIC) will help lead the development and implementation of intelligent construction into the future.
- Published
- 2021
5. Reducing Waiting Time for Orthopaedic Consultation Through a Continuous Improvement Approach
- Author
-
Rui M. Lima, Bruno Gonçalves, José Dinis-Carvalho, and Elisa Vieira
- Subjects
Waiting time ,Service (business) ,business.industry ,030503 health policy & services ,media_common.quotation_subject ,medicine.disease ,Outpatient service ,Triage ,3. Good health ,Term (time) ,03 medical and health sciences ,0302 clinical medicine ,Medicine ,Quality (business) ,030212 general & internal medicine ,Medical emergency ,0305 other medical science ,business ,media_common - Abstract
The outpatient services of public hospitals have had great difficulty responding to the high demand to which they are subjected. Waiting times for a medical appointment are well above the clinically recommended waiting time, compromising patients' health status. To combat this trend, health authorities have set maximum response times for consultations, financially penalizing hospitals that do not meet them. However, these penalties only act as an additional problem for hospitals. Given the relevance of improving the care provided to patients, in terms of both quality and speed, this article presents a study carried out at the outpatient service of a Portuguese hospital. The study presents a set of actions implemented as the result of a simple Continuous Improvement (CI) approach. The CI approach was inspired by Toyota Kata, and the results obtained translate into a reduction of the average waiting time (in days) for triage by 90% and for the first orthopaedic consultation by 27%. The implemented actions were also planned to increase the availability of the service in the long term, allowing an additional reduction in the waiting time for consultation.
- Published
- 2021
6. Digital Twins in Remote Labs
- Author
-
Heinz-Dietrich Wuttke, Karsten Henke, and René Hutschenreuter
- Subjects
0303 health sciences ,Amazon web services ,Multimedia ,business.industry ,Computer science ,media_common.quotation_subject ,Early detection ,computer.software_genre ,Term (time) ,03 medical and health sciences ,0302 clinical medicine ,Virtual machine ,030220 oncology & carcinogenesis ,Quality (business) ,The Internet ,State (computer science) ,Internet of Things ,business ,computer ,030304 developmental biology ,media_common - Abstract
Accelerated by current research on the Internet of Things, working with virtual models and things has taken a significant step forward. The term digital twin refers to the development of virtual models of processes, products or services that allow the simulation of system behavior and the early detection of problems before the real system exists. Simulations have been state of the art for a long time, but the service-oriented possibilities of the Internet yield a new quality of simulation. It has meanwhile become possible, for example, to configure a virtual computer on the Internet and run algorithms on it without the computer physically existing; see, e.g., AWS (Amazon Web Services). This paper discusses the possibilities that the digital twin concept can offer for educational purposes in remote and virtual laboratories.
- Published
- 2019
7. Data science approach to stock prices forecasting in Indonesia during Covid-19 using Long Short-Term Memory (LSTM)
- Author
-
Widodo Budiharto
- Subjects
lcsh:Computer engineering. Computer hardware ,Information Systems and Management ,Computer Networks and Communications ,Computer science ,media_common.quotation_subject ,Big data ,lcsh:TK7885-7895 ,02 engineering and technology ,lcsh:QA75.5-76.95 ,Data science ,020204 information systems ,0202 electrical engineering, electronic engineering, information engineering ,Computational statistics ,Stock (geology) ,media_common ,lcsh:T58.5-58.64 ,Stock market ,lcsh:Information technology ,business.industry ,Deep learning ,Research ,Closing (real estate) ,Term (time) ,Hardware and Architecture ,Value (economics) ,020201 artificial intelligence & image processing ,lcsh:Electronic computers. Computer science ,Artificial intelligence ,business ,LSTM ,Finance ,Information Systems ,Forecasting - Abstract
Background The stock market process is full of uncertainty; hence stock price forecasting is very important in finance and business. For stockbrokers, understanding trends, supported by prediction software, is very important for decision making. This paper proposes a data science model for stock price forecasting on the Indonesian exchange based on statistical computing in the R language and Long Short-Term Memory (LSTM). Findings The first confirmed Covid-19 (Coronavirus disease-19) case in Indonesia was on 2 March 2020. After that, the composite stock price index plunged 28% from the start of the year, and the share prices of cigarette producers and banks in the midst of the corona pandemic reached their lowest values on 24 March 2020. We use big data from Bank Central Asia (BCA) and Bank Mandiri of Indonesia, obtained from Yahoo Finance. In our experiments, we visualize the data using data science and predict and simulate the important prices, called Open, High, Low and Closing (OHLC), with various parameters. Conclusions Based on the experiments, data science is very useful for data visualization, and our proposed method using Long Short-Term Memory (LSTM) can be used as a predictor on short-term data, achieving 94.57% accuracy with a short-term (1 year) training set and a high number of epochs in the training phase, rather than with 3 years of training data.
- Published
- 2021
8. Adaptive Goal Function of Ant Colony Optimization in Fake News Detection
- Author
-
Barbara Probierz, Piotr Stefański, Jan Kozak, and Przemysław Juszczuk
- Subjects
business.industry ,Computer science ,Ant colony optimization algorithms ,media_common.quotation_subject ,InformationSystems_INFORMATIONSTORAGEANDRETRIEVAL ,Machine learning ,computer.software_genre ,Class (biology) ,Term (time) ,Disinformation ,Selection (linguistics) ,The Internet ,Quality (business) ,Artificial intelligence ,business ,Precision and recall ,computer ,media_common - Abstract
Currently, there is very rapid growth of information published on the Internet, both on social media and on news sites. A serious problem, however, is disinformation in the form of fake news. Due to the rapid spread of information on the Internet, it is very important to be able to quickly distinguish true from fake news. A solution to this problem can be an initial analysis of news by its title and a quick selection of true or fake news. Additionally, the possibility of balancing precision and recall as classification quality measures could allow for better news selection. In this paper, we propose the use of an adaptive goal function for ant colony optimization algorithms in fake news detection. The goal of this solution is to increase the recall or precision of the selected class – in this case fake or true news. We use natural language processing (NLP) to describe the title of the news. In addition, a constrained term matrix is used. The choice of titles alone and the restriction of the words analyzed are intended to speed up the initial classification. Finally, we present an analysis of a real dataset and the classification results (detailing recall and precision) of news using the adaptive goal function of the ACDT algorithm.
- Published
- 2021
9. Predictive Intelligence at a Glance
- Author
-
Uwe Seebacher
- Subjects
Presentation ,Capability Maturity Model ,business.industry ,Computer science ,media_common.quotation_subject ,Compact form ,Artificial intelligence ,business ,media_common ,Term (time) - Abstract
This chapter provides a compact overview and an introductory presentation of the term Predictive Intelligence. It describes how Predictive Intelligence can be established in an organization step by step. To enable readers to analyze the initial situation in their own organization, the Predictive Intelligence maturity model is described, as is the analysis instrument. At the end of the section, the advantages of Predictive Intelligence (PI) and, above all, its sustainable effects are discussed in compact form.
- Published
- 2021
10. AI-Based Voice Assistants Technology Comparison in Term of Conversational and Response Time
- Author
-
Yusuph J. Koni, Seyitmammet Alchekov Saparmammedovich, Hoon-Jae Lee, and Mohammed Abdulhakim Al-Absi
- Subjects
World Wide Web ,Mobile phone ,Computer science ,media_common.quotation_subject ,Natural (music) ,Response time ,Conversation ,Voice command device ,User interface ,Field (computer science) ,Term (time) ,media_common - Abstract
With the rapid development of artificial intelligence (AI) technology, the field of mobile phone technology is also developing in the direction of artificial intelligence. Natural user interfaces are becoming popular. Among the most common natural user interfaces nowadays are voice-activated interfaces, particularly smart personal assistants such as Google Assistant, Alexa, Bixby, and Siri. This paper presents the results of an evaluation of these four smart personal assistants under two hypotheses. The first concerns how well each application interacts conversationally, that is, how well it can keep answering questions that depend on the previous answer; the second measures how long each app takes to respond in those conversations, regardless of whether it was able to keep the conversation going. To determine this, the four apps were installed on four Samsung smartphones, and different follow-up questions were asked, each after the previous answer was completed, at the same time to all voice assistants. The overall results show that Google Assistant was able to keep answering questions that depended on the previous answer without missing any question, unlike the other digital assistants, but it was the slowest app to respond in those conversations. Siri failed to keep the conversation going on a few questions compared with Bixby and Alexa, but it was the fastest app to respond. Bixby was able to keep the conversation going only on mathematical questions and was very poor on other questions; in terms of response time, it was the third app, after Alexa. Alexa kept the conversation going on more questions than Bixby and, in terms of response time, was second after Siri.
- Published
- 2021
11. Effects of Resources (Time, Money, Income, and Wealth) on Wellbeing
- Author
-
M. Joseph Sirgy
- Subjects
Quality of life (healthcare) ,media_common.quotation_subject ,Happiness ,Life satisfaction ,Meaning (existential) ,Psychology ,Level of analysis ,Eudaimonia ,Social psychology ,Savoring ,media_common ,Term (time) - Abstract
This chapter discusses the effects of resources such as time, money, income, and wealth on subjective aspects of quality of life (hedonic wellbeing, life satisfaction, and eudaimonia). With respect to time, much research in this area attempts to answer questions such as which activities people spend time on to produce the greatest happiness, how savoring time can produce happiness, and how people extract meaning from time. With respect to money, the research highlights the things people spend money on to enhance their wellbeing. The research also addresses conditions that shed light on the wellbeing effects of time versus money. Subsequently, research dealing with the effects of income and wealth on wellbeing is discussed. This discussion is broken down by level of analysis (individual versus national) and time frame (short term versus long term).
- Published
- 2021
12. The Role of Fluency and Dysfluency in Metacognitive Experiences
- Author
-
Bennett L. Schwartz and Andreas Jemstedt
- Subjects
Fluency ,media_common.quotation_subject ,Perspective (graphical) ,Illusion ,Metacognition ,Psychology ,Control (linguistics) ,Self-regulated learning ,media_common ,Cognitive psychology ,Term (time) - Abstract
Metacognition is a broad term that means different things to researchers in different sub-areas. A major contribution of Anastasia Efklides is to bring together disparate approaches to metacognition under one theoretical perspective. In this paper, we examine the concept of fluency and how it has been employed in metacognition research. Fluency-based judgments are generally considered to be the primary source of inaccuracy in metacognitive judgments, as well as the primary reason why metacognitive control goes astray in self-regulated learning. We discuss how and when fluent processing influences metacognition, when fluency leads to accurate judgments, and when it leads to illusions.
- Published
- 2021
13. Exponential Technologies and Future Scenarios
- Author
-
Giovanni Cacciamani, Nicola Marino, and Domenico Veneziano
- Subjects
Computer science ,business.industry ,media_common.quotation_subject ,Big data ,Business model ,Deception ,Term (time) ,Exponential growth ,Disruptive innovation ,Quality (business) ,business ,Industrial organization ,Digitization ,media_common - Abstract
The development of technologies in recent years has pervaded every sector, radically transforming business models and workflows. At the base of this dramatic change is the exponential growth that began in the 1960s and was theorized by Gordon Moore, according to which the complexity of a microcircuit, measured for example by the number of transistors per chip, doubles every 18 months. The neologism “exponential technologies” was born by attaching this “exponential” character to the intrinsic meaning of the term “technologies”: tools that allow goals to be achieved and problems to be solved that were previously unattainable due to a high ratio between price and quality. These technologies, developed in specific sectors, have over the years acquired the character of “disruptive innovation”; that is, they have occupied new sectors and created new markets, breaking down old paradigms by virtue of their high effectiveness-to-cost ratio. Exponential technologies also share some common characteristics that distinguish their growth and diffusion, as well as their potential to have a positive impact on 1 billion people worldwide within 10 years: Deception, Digitization, Demonetization, Dematerialization, and Democratization.
- Published
- 2021
14. Evaluating Task-General Resilience Mechanisms in a Multi-robot Team Task
- Author
-
James Staley and Matthias Scheutz
- Subjects
Resilience ,Computer science ,media_common.quotation_subject ,Long-term autonomy ,Intelligent decision support system ,0102 computer and information sciences ,02 engineering and technology ,computer.software_genre ,01 natural sciences ,Term (time) ,Task (project management) ,Intelligent agent ,010201 computation theory & mathematics ,Human–computer interaction ,Intelligent autonomous agents ,0202 electrical engineering, electronic engineering, information engineering ,Robot ,[INFO]Computer Science [cs] ,020201 artificial intelligence & image processing ,Set (psychology) ,Function (engineering) ,Resilience (network) ,computer ,media_common - Abstract
Part 5: Autonomous Agents. Real-world intelligent agents must be able to detect sudden and unexpected changes to their task environment and respond to those changes effectively in order to function properly in the long term. We therefore isolate a set of perturbations that agents ought to address and demonstrate how task-agnostic perturbation detection and mitigation mechanisms can be integrated into a cognitive robotic architecture. We present results from experimental evaluations of perturbation mitigation strategies in a multi-robot system that show how intelligent systems can achieve higher levels of autonomy by explicitly handling perturbations.
- Published
- 2021
15. Scope and Conditional Functions
- Author
-
John Hunt
- Subjects
Theoretical computer science ,Scope (project management) ,Block (programming) ,Computer science ,media_common.quotation_subject ,Code (cryptography) ,Context (language use) ,Object (computer science) ,Function (engineering) ,Expression (mathematics) ,Term (time) ,media_common - Abstract
Kotlin provides five functions whose sole purpose is to execute a block of code within the context of an instance or an object. When you call such a function on an instance with a lambda expression, it forms a temporary scope (hence the term scope function). In this scope, you can access the instance directly, typically via the this reference or, in two cases, via the it implicit parameter.
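A minimal sketch of the mechanism the abstract describes, using two of Kotlin's scope functions; the User class here is a hypothetical example, not drawn from the chapter:

```kotlin
// Hypothetical class for illustration only
data class User(var name: String = "", var age: Int = 0)

fun main() {
    // `apply` runs its block with the instance bound to `this`
    // and returns the instance, so it suits in-place configuration.
    val u = User().apply {
        name = "Ada"   // resolves against `this` (the User being configured)
        age = 36
    }

    // `let` instead exposes the instance as the implicit parameter `it`
    // and returns the lambda's result rather than the instance.
    val label = u.let { "${it.name} (${it.age})" }

    println(label) // prints "Ada (36)"
}
```

Of the five functions, run, with and apply bind the context object to this, while let and also expose it as it; they further differ in whether they return the context object itself or the lambda's result.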
- Published
- 2021
16. Learning Disentangled Representations with the Wasserstein Autoencoder
- Author
-
David Barber, Benoit Gaujac, and Ilya Feige
- Subjects
Flexibility (engineering) ,Computer science ,business.industry ,media_common.quotation_subject ,Fidelity ,Latent variable ,Machine learning ,computer.software_genre ,Autoencoder ,Term (time) ,Total correlation ,Artificial intelligence ,business ,Representation (mathematics) ,computer ,Feature learning ,media_common - Abstract
Disentangled representation learning has undoubtedly benefited from objective function surgery. However, a delicate balancing act of tuning is still required in order to trade off reconstruction fidelity against disentanglement. Building on previous successes of penalizing the total correlation in the latent variables, we propose TCWAE (Total Correlation Wasserstein Autoencoder). Working in the WAE paradigm naturally enables the separation of the total-correlation term, thus providing disentanglement control over the learned representation, while offering more flexibility in the choice of reconstruction cost. We propose two variants using different KL estimators and perform extensive quantitative comparisons on data sets with known generative factors, showing competitive results relative to state-of-the-art techniques. We further study the trade-off between disentanglement and reconstruction on more difficult data sets with unknown generative factors, where we expect improved reconstructions due to the flexibility of the WAE paradigm.
- Published
- 2021
17. Corporate Environmental Responsibility and Capital Structure
- Author
-
Panagiotis Dimitropoulos and Konstantinos Koronios
- Subjects
Corporate environmental responsibility ,Capital structure ,Shareholder ,Debt ,media_common.quotation_subject ,Economics ,Equity (finance) ,Monetary economics ,Endogeneity ,Term (time) ,media_common ,Differential impact - Abstract
The aim of this chapter is to examine the impact of CER-related performance on firms' capital structure decisions, distinguishing between short-term debt, long-term debt and common equity capital and taking into consideration the potential endogeneity between the examined factors and their bidirectional association, evidenced by previous studies in the field. The empirical analysis suggests that CER performance negatively impacts long-term debt but positively impacts short-term debt and shareholders' equity, indicating a differential impact of CER on capital structure decisions. Practically, we can argue that CER-performing firms rely more on their own shareholders' equity and short-term debt to finance their activities, and less on long-term debt.
- Published
- 2021
18. Towards Dark Jargon Interpretation in Underground Forums
- Author
-
XiaoFeng Wang, ChengXiang Zhai, Dominic Seyler, and Wei Liu
- Subjects
Vocabulary ,Computer science ,business.industry ,Interpretation (philosophy) ,media_common.quotation_subject ,computer.software_genre ,GeneralLiterature_MISCELLANEOUS ,Term (time) ,Jargon ,Identification (information) ,Simulated data ,Artificial intelligence ,business ,computer ,Natural language processing ,media_common ,Meaning (linguistics) - Abstract
Dark jargons are benign-looking words that have hidden, sinister meanings and are used by participants of underground forums for illicit behavior. For example, the dark term “rat” is often used in lieu of “Remote Access Trojan”. In this work we present a novel method towards automatically identifying and interpreting dark jargons. We formalize the problem as a mapping from dark words to “clean” words with no hidden meaning. Our method makes use of interpretable representations of dark and clean words in the form of probability distributions over a shared vocabulary. In our experiments we show our method to be effective in terms of dark jargon identification, as it outperforms another baseline on simulated data. Using manual evaluation, we show that our method is able to detect dark jargons in a real-world underground forum dataset.
- Published
- 2021
19. Causes and Consequences of Child Labor
- Author
-
Isidro Maya Jariego
- Subjects
Empirical research ,Poverty ,media_common.quotation_subject ,Immigration ,Demographic economics ,Truancy ,Compulsory education ,Family history ,Employability ,Psychology ,media_common ,Term (time) - Abstract
In this chapter, we examine the causes and consequences of child labor. Working children usually come from low-income families, and, among other possible outcomes, there is a high risk that they will drop out of compulsory education. The most relevant and well-documented background factor is poverty. Other empirically documented causes are family size, immigration, a family history of child labor, and public attitudes. Among the consequences is the educational impact. It is generally assumed that child labor also affects children's health and psychological well-being and, in the long term, their employability and working conditions during adulthood. However, empirical research on these negative effects is significantly less developed.
- Published
- 2021
20. Estuarine Environmental Monitoring Programs: Long-Term Studies
- Author
-
Noelia Soledad la Colla, Ana Carolina Ronda, Gabriela Blasina, Vanesa Lorena Negrin, Andrea Lopez Cazorla, Sandra Marcela Fiori, Sandra Elizabeth Botté, Andres Hugo Arias, Silvia G. De Marco, María Amelia Cubitto, Mónica Diana Baldini, Pía Simonetti, Juan Manuel Molina, Analia Veronica Serra, Marcelo Tomas Pereyra, Jorge Eduardo Marcovecchio, and Ana Laura Oliva
- Subjects
Computer science ,business.industry ,media_common.quotation_subject ,Environmental resource management ,Environmental monitoring ,Quality (business) ,Environmental systems ,business ,Term (time) ,media_common - Abstract
Monitoring programs are evaluation procedures designed to assess the state and evolution of a system over time. In particular, environmental monitoring programs are those applied to assess the condition and quality of an ecosystem (or environmental system of the corresponding level) in a given period. Different types of monitoring programs, used in different circumstances and situations, have been characterized in due course. Long-term monitoring programs have numerous advantages, and their results are those that allow us to draw the best conclusions, as well as to provide the best recommendations to the corresponding decision makers. The historical background of the application of this type of environmental monitoring program in different parts of the world is reviewed. Finally, the application of long-term environmental monitoring within the Bahia Blanca Estuary is analyzed as a particular study case.
- Published
- 2021
21. Understanding the Fundamental Principles of Ecosystems through a Global Network of Long-Term Ecological Research Sites
- Author
-
Kristin Vanderbilt and Robert B. Waide
- Subjects
Strategic planning ,Scarcity ,Information management ,Ecology ,Political science ,media_common.quotation_subject ,Corporate governance ,Global network ,Climate change ,Ecosystem ecology ,media_common ,Term (time) - Abstract
The Long Term Ecological Research (LTER) Network served as a catalyst to promote cooperation among multi-national research programs and networks. The chapter describes the strategic planning process in the 1990s that led to the creation of the International Long Term Ecological Research (ILTER) Network, which expanded rapidly in the late-1990s with support of the National Science Foundation (NSF). Especially under the leadership of James Gosz, cooperative arrangements were made with countries worldwide and 23 LTER networks had formed by 2003. Many U.S. LTER scientists and information management specialists contributed to the expansion of the ILTER Network through site visits and workshops. After NSF scaled back its support, ILTER reorganized its governance structure and grew into a robust, self-sustaining network of networks. The chapter reviews examples of collaborations in research and information management. The ILTER Network today aims to become part of a global infrastructure to address continental and global socio-ecological problems through partnerships with other international networks. Concerns about climate change, biodiversity loss, and the scarcity of research sites producing long-term data led to the creation of the U.S. LTER Network and hence ILTER, and the legacies of data, information, and long-term measurements that have resulted are a critical contribution to resolving current global problems.
- Published
- 2021
22. Conceptualising Hidden Geographies
- Author
-
Marko Krevs
- Subjects
Geoinformatics ,Perception ,media_common.quotation_subject ,Scientific literature ,USable ,Epistemology ,media_common ,Term (time) ,Variety (cybernetics) - Abstract
After several decades of rather sporadic use in the scientific literature, the concept of hidden geographies is still usually based on provisional definitions that support the specific geographical hiddenness of the topic presented in a publication. This chapter focuses on hidden geographies, with the aim of providing a usable, though not necessarily definitive, understanding and definition of the concept. After a conceptual-semantic look at hidden geographies, the meanings of the concept and the term are presented, based on an analysis of the literature, which provides a colourful variety of connotations and names for the concept in practice. The discussion highlights some of the contexts underlying the concept under study, as well as questions regarding its understanding and use, such as understanding the blurred line between hidden and revealed geography, and the roles of geography and geoinformatics in revealing or hiding geographies. Finally, a general definition and some specific definitions are proposed, linked to four layers of understanding of the concept: undiscovered, uncognised, unpublished and deliberately hidden geographies.
- Published
- 2021
23. The Big Bang Theory
- Author
-
Emilio Elizalde
- Subjects
Big Bang ,History ,Expression (architecture) ,media_common.quotation_subject ,Context (language use) ,Universe ,media_common ,Metric expansion of space ,Simple (philosophy) ,Term (time) ,Epistemology - Abstract
We are now entering a crucial chapter, the most original one for the new interpretations given, and also the most difficult chapter of this book to understand. If the expansion of the universe is not easy to accept, getting to understand its origin in any detail is even harder. That alone can explain the enormous number of inaccuracies and crazy absurdities that have been written, and continue to be written every day, on this issue. Many of the explanations that we can currently find in the media cling to the one that Lemaître gave 90 years ago. This is amazing. But he never said “Big Bang”! Referring again to the purpose of this book, we will delve in this chapter into the depths of the true origins of the term “Big Bang.” When did this expression first appear? What exactly did the person who first spoke these two words mean? Why did he employ this term and not another? And in what context was it used? Actually, this chapter, and with it the whole topic, does not begin abruptly, but rather as a very smooth, absolutely logical continuation of the previous chapter. Having already accepted that the universe was expanding, it was enough for Lemaître to turn his head and look back in time, towards the past. It was that simple: in the remote past, the whole universe must necessarily have been concentrated inside a highly dense nutshell.
- Published
- 2021
24. An Econometric Study of the Term Structure and the Real Economy
- Author
-
David S. Bywaters and D. Gareth Thomas
- Subjects
Expectancy theory ,Work (electrical) ,media_common.quotation_subject ,Econometrics ,Economics ,Inversion (meteorology) ,Loanable funds ,Recession ,Mechanism (sociology) ,media_common ,Interest rate ,Term (time) - Abstract
This completely new chapter investigates U.S.A. data on GDP growth, interest rates and expectations from the Livingston and Michigan surveys, using econometrics to show evidence that the expectations theory could be the basis of the term structure. The work goes on to explore how well differences in U.S.A. interest rates at different maturities explain upcoming GDP growth, and to show the importance of the empirical inversion of interest-bearing assets in determining subsequent states of the economy and the risk of recession over the course of the loanable funds cycle. In a sense, this work explores the monetary policy transmission mechanism in the U.S.A. It is all entirely new to the book.
- Published
- 2021
25. An Alternative Model: Stabilizing the Long-Term Interest Rate
- Author
-
Burkhard Wehner
- Subjects
media_common.quotation_subject ,Inflation rate ,Money supply ,Monetary policy ,Economics ,Monetary theory ,Limit (mathematics) ,Monetary economics ,Potential output ,Term (time) ,Interest rate ,media_common - Abstract
The search for a new conception of monetary policy must begin with admitting that monetary theory that derives policy recommendations from traditional variables such as money supply, the current inflation rate, potential output, or even nominal GDP has reached its limit.
- Published
- 2021
26. Abduction and the Logic of Inquiry: Modern Epistemic Vectors
- Author
-
Jay Schulkin
- Subjects
Pragmatism ,Range (mathematics) ,Computer science ,media_common.quotation_subject ,Inference ,Lexicon ,media_common ,Statistical hypothesis testing ,Term (time) ,Epistemology - Abstract
C. S. Peirce introduced the concept of abduction into our epistemic lexicon. It is a view of problem solving that emphasizes ecological contexts and preparatory or predictive predilections knotted to learning and inquiry. More broadly, abduction is essentially tied to pragmatism. One view of the brain reflects the fact that predictive predilections knotted to abduction or hypothesis testing dominate the landscape of diverse forms of problem solving. Abduction is biologically constrained and contextual; it is not a monolithic term, and it runs the range of neural capability.
- Published
- 2021
27. The Hunt for the Simplest Possible Vocabulary: Minimal Finnish Meets Easy Finnish
- Author
-
Ulla Vanhatalo and Leealaura Leskelä
- Subjects
Vocabulary ,ComputingMilieux_THECOMPUTINGPROFESSION ,Grammar ,Computer science ,Text simplification ,media_common.quotation_subject ,ComputerApplications_COMPUTERSINOTHERSYSTEMS ,Context (language use) ,Lexicon ,Linguistics ,Term (time) ,Test (assessment) ,Language form ,media_common - Abstract
In the current European context, the term Easy Language (previously easy-to-read language) refers to a language form that has been modified to be easier to read and understand for people with various linguistic challenges. This empirical pilot study combined Minimal Finnish and Easy Finnish, investigating whether Minimal Finnish could provide core vocabulary for Easy Finnish. Easy Finnish experts adapted a Standard Finnish text on discrimination, using both conventional Easy Finnish guidelines and the 300-word lexicon of Minimal Finnish, then presented the texts to test readers. The study found that Minimal Finnish is beneficial for explaining abstract concepts, and the semantic primes and molecules may be considered a list of ‘safe’ words. However, it can also give rise to more complex grammar or lengthen the text.
- Published
- 2021
28. The Welfarist Account of Disenhancement as Applied to Nonhuman Animals
- Author
-
Adam Shriver
- Subjects
media_common.quotation_subject ,030229 sport sciences ,06 humanities and the arts ,Bioethics ,0603 philosophy, ethics and religion ,Term (time) ,03 medical and health sciences ,0302 clinical medicine ,Key (cryptography) ,Economics ,Criticism ,060301 applied ethics ,Affect (linguistics) ,Positive economics ,Welfare ,media_common - Abstract
I criticize the current usage of the terms “enhancement” and “disenhancement” in the debate over the genetic modification of animals and propose an alternative definition of these terms based on how modifications affect animals’ welfare in particular contexts. The critique largely follows a similar criticism of the use of the term “enhancements” in the human bioethics literature. I first describe how the term “disenhancement” has been used in debates thus far, and argue that the present lack of a shared definition is problematic. I then consider some potential definitions of “disenhancement” that can be adapted from the human bioethics literature and argue that most of these uses are flawed for the purposes of using the term in current ethical debates. Finally, I elaborate on the welfarist conception of disenhancement and consider some potential objections, using examples from the literature to illustrate key points.
- Published
- 2021
29. Efficient Mixing of Arbitrary Ballots with Everlasting Privacy: How to Verifiably Mix the PPATC Scheme
- Author
-
Thomas Haines, Kristian Gjøsteen, and Morten Rotvold Solberg
- Subjects
Scheme (programming language) ,021110 strategic, defence & security studies ,Computer science ,business.industry ,Electronic voting ,media_common.quotation_subject ,0211 other engineering and technologies ,Homomorphic encryption ,02 engineering and technology ,Computer security ,computer.software_genre ,Encryption ,Term (time) ,Voting ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,business ,computer ,ElGamal encryption ,media_common ,Quantum computer ,computer.programming_language - Abstract
The long-term privacy of voting systems is of increasing concern as quantum computers come closer to reality. Everlasting privacy schemes currently offer the best way to manage these risks. While homomorphic tallying schemes with everlasting privacy are well developed, most national elections using electronic voting rely on mixnets. Currently, the best candidate encryption scheme for making these kinds of elections everlastingly private is PPATC, but it has not been shown to work with any mixnet of comparable efficiency to the current ElGamal mixnets. In this work we give a paper proof, and a machine-checked proof, that the variant of Wikström's mixnet commonly in use is safe for use with the PPATC encryption scheme.
- Published
- 2021
30. Mobile Multitasking in Urban Contexts: Habituation and Countermeasures
- Author
-
Pierre-Majorique Léger, Ryad Titah, and Zoubeir Tkiouat
- Subjects
Motor task ,Conceptual framework ,Human–computer interaction ,Computer science ,media_common.quotation_subject ,Human multitasking ,ComputerSystemsOrganization_SPECIAL-PURPOSEANDAPPLICATION-BASEDSYSTEMS ,Context (language use) ,Habit ,Habituation ,Term (time) ,media_common - Abstract
With the increasing adoption of mobile electronic devices such as smartphones, it becomes more and more important to address the risks associated with their use. As such, this research paper addresses the behavior and habit formation of mobile multitasking, i.e., the use of a mobile IT device while performing a motor task such as walking, and its negative impacts on the individual's performance and safety. To better understand how different countermeasures impact the behavior of mobile multitasking in the short term, as well as the habit of mobile multitasking in the long term, the present paper introduces a classification of the different countermeasures that are put in place to curb the risks of mobile IT multitasking in an urban context. It then proposes a conceptual framework that explains the mechanisms and impacts of both deterrent and preventive countermeasures on the behavior of mobile multitasking and on the habit formation of mobile multitasking.
- Published
- 2021
31. Instead of Conclusions: Short- and Long-Term Scenarios for Media Regulation
- Author
-
Franck Rebillard, Larry Kilman, Françoise Benhamou, Iva Nenadić, Marko Milosavljević, Sorin Adam Matei, Nicolas Curien, and Maud Bernisson
- Subjects
Deregulation ,History ,Media regulation ,media_common.quotation_subject ,Quality (business) ,Convergence (relationship) ,Pessimism ,Marketing ,Monopoly ,Term (time) ,media_common - Abstract
Those who do not study history are doomed to repeat it, but much the same can be said of those who fail to prepare for the future. In this chapter, both the long and short view of the future are taken by our panel of contributors: short-term outlooks often revolve around the media titans of today—GAFAM (Google, Apple, Facebook, Amazon, Microsoft), which, according to Benhamou, cannot be ignored. Long-term outlooks include both positivity—Curien—and pessimism, via Bernisson. Of course, these forecasting exercises will only meet reality halfway. The future is as unpredictable as the past is unbelievable.
- Published
- 2021
32. Contracts in Project Finance
- Author
-
Farid Mohamadi
- Subjects
Finance ,Shareholders' agreement ,Negotiation ,business.industry ,media_common.quotation_subject ,Value (economics) ,Project finance ,Revenue ,Asset (economics) ,Business ,Power purchase agreement ,media_common ,Term (time) - Abstract
Renewable energy infrastructures are multimillion-dollar long-term investments that involve dozens of extensive studies of all sorts and numerous stakeholders, public and private, each of them bound to the project company and to each other through one or several agreements. One could prudently say that the complexity of an agreement goes with the importance of what is at stake in it; in other words, the bigger the stake, the more complex the contract. In project finance, these contracts are one of the most relevant pieces of the financing puzzle, since together with the obtained permits and licenses they represent the “asset” of the project company. If a power purchase agreement (PPA), for instance, has good conditions relative to the average market conditions, e.g., a higher offtake price or a longer term, it will create more “asset” value for the project. This complexity requires the participation of experts, advisors, lawyers, financiers and promoters, among others, who will discuss and negotiate the contract terms from their perspective. All agreements are relevant; however, some of them, because of the high impact they might have on the future revenue stream of the project, require special attention and care from the contracting parties. The offtake contract, the EPC and O&M contracts, the shareholders' agreement, the share purchase agreement and the facility agreement clearly belong to this category and will be thoroughly analyzed.
- Published
- 2021
33. Assessing the Impact of Haze on Solar Energy Potential Using Long Term PM2.5 Concentration and Solar Insolation Field Data in Naples, Italy
- Author
-
V. Manna, Massimiliano Fabbricino, G. Sorrentino, Grazia Fattoruso, M. Nocerino, G. Di Francia, and S. De Vito
- Subjects
Pollutant ,Pollution ,Haze ,business.industry ,020209 energy ,media_common.quotation_subject ,Photovoltaic system ,0211 other engineering and technologies ,Air pollution ,02 engineering and technology ,Atmospheric sciences ,medicine.disease_cause ,Solar energy ,Term (time) ,Atmosphere ,021105 building & construction ,0202 electrical engineering, electronic engineering, information engineering ,medicine ,Environmental science ,business ,media_common - Abstract
Atmospheric fine particulate pollution seriously affects human health, but it also affects the passage of light through the lower atmosphere, reducing the solar radiation that reaches the ground as well as PV panels. In this study, the solar insolation reduction due to air pollution has been investigated in the city of Naples (Italy). Analyzing local long-term field data, we found that the solar insolation reduction is exponentially correlated to the PM2.5 concentration. Using the derived empirical relation, we estimated that for Naples the solar insolation was reduced by around 5%, or 66.20 kWh/m², within a one-year period (May 2018–May 2019) due to fine particulate air pollution. This study provides the theoretical basis for designing successful solar PV systems to be mounted on building rooftops or in other suitable sites, taking the local air pollution conditions into account.
- Published
- 2021
34. Modified Control Charts Monitoring Long-Term Semiconductor Manufacturing Processes
- Author
-
Rafael H. Furukawa, Hingmar A. Henriques Jr., Giovanni Moura de Holanda, and Jorge Moreira de Souza
- Subjects
Semiconductor industry ,Computer science ,Semiconductor device fabrication ,media_common.quotation_subject ,Process efficiency ,Quality (business) ,Control chart ,Statistical process control ,Quality policy ,Term (time) ,media_common ,Reliability engineering - Abstract
Statistical methods have been systematically adopted in efforts to improve process efficiency and semiconductor manufacturing quality, and some modeling advances have been made to meet the business peculiarities and specific characteristics of this industry. This paper presents an approach to address long-term Pp (Ppk) indices using a modified control chart. The idea is to apply this model to dynamically obtain the ppm (parts per million) and the acceptance limits given some parameters (the long-term target, the probability of rejecting a good item, and the upper and lower specification limits), such that the cost of identifying and correcting special causes is compatible with the cost of off-target products and with the corporate quality policy. This approach was developed to integrate a statistical module for monitoring and controlling equipment set-up at a Brazilian semiconductor manufacturing company.
- Published
- 2021
35. Randomised Controlled Cross-Over Trial Measuring Brain-Computer Interface Metrics to Characterise the User Experience of Search Engines When Ambiguous Search Queries Are Used
- Author
-
Lizette De Wet, Wynand Nel, and Robert Schall
- Subjects
Information retrieval ,Computer science ,business.industry ,media_common.quotation_subject ,Interface (computing) ,String searching algorithm ,Ambiguity ,Term (time) ,Search engine ,User experience design ,The Internet ,business ,media_common ,Brain–computer interface - Abstract
Retrieving information through World Wide Web searches is part of daily life. Many people prefer short search strings/queries, which can be ambiguous because of their brevity. This ambiguity often causes search engines to return thousands of irrelevant results, which can cause frustration with the particular search engine. Consequently, users might rate the particular search engine unfavourably. We conducted a randomised controlled cross-over trial with a Graeco-Latin square design to measure various user emotions (Frustration, Excitement, Meditation and Engagement) with a Brain-Computer Interface while participants performed ambiguous Internet searches in Google, Yahoo! and Bing. The study results suggest that emotion data captured with a Brain-Computer Interface, together with pre-test and post-test questionnaire feedback, can be used to characterise the user experience of different search engines when searches are conducted using ambiguous search terms. In particular, the effects of Search Engine and Search Term on the measured emotions were significant, while the effect of Occasion was not.
- Published
- 2021
36. Long-Term Exploration in Persistent MDPs
- Author
-
Aleksandr I. Panov, Alexey Skrynnik, and Leonid Ugadiarov
- Subjects
Computer science ,business.industry ,media_common.quotation_subject ,Test (assessment) ,Term (time) ,Curiosity ,Reinforcement learning ,State space ,Domain knowledge ,Quality (business) ,Markov decision process ,Artificial intelligence ,business ,media_common - Abstract
Exploration is an essential part of reinforcement learning and restricts the quality of the learned policy. Hard-exploration environments are characterised by huge state spaces and sparse rewards. In such conditions, an exhaustive exploration of the environment is often impossible, and successfully training an agent requires a lot of interaction steps. In this paper, we propose an exploration method called Rollback-Explore (RbExplore), which utilizes the concept of the persistent Markov decision process, in which agents can roll back to visited states during training. We test our algorithm in the hard-exploration Prince of Persia game, without rewards and domain knowledge. At all levels of the game used, our agent outperforms or shows results comparable with state-of-the-art curiosity methods with knowledge-based intrinsic motivation: ICM and RND. An implementation of RbExplore can be found at https://github.com/cds-mipt/RbExplore.
- Published
- 2021
37. Chaotic Time Series Prediction: Run for the Horizon
- Author
-
Vasilii A. Gromov
- Subjects
Set (abstract data type) ,Exponential growth ,Computer science ,media_common.quotation_subject ,Context (language use) ,Time series ,Cluster analysis ,Infinity ,Algorithm ,Chaotic time series prediction ,Term (time) ,media_common - Abstract
The present article reviews some recent papers concerned with chaotic time series prediction in the context of predictive clustering, and discusses in greater detail some novel techniques designed to avoid the 'curse of exponential growth', whereby errors grow exponentially with the number of steps ahead to be predicted. These techniques are non-successive observations, combined with a prognosis that employs already predicted values, the concept of non-predictable points, and a quality assessment of the clusters used. The approach discussed allows one to separate the calculation into two parts: the first, essentially larger, part is performed off-line; the second, the immediate prediction routine, is carried out on-line. This makes it possible to design fast and efficient prediction algorithms. A wide-ranging simulation suggests that the error term associated with the prediction sub-model used vanishes as the validation set size grows to infinity, provided that the clusters used to predict are chosen correctly. Similarly, the error term associated with an incorrect choice of the clusters used to predict decreases as the validation set size increases.
- Published
- 2021
38. Designing an Arts-Based, Collaborative Mystery Game to Improve Players’ Motivation and Confidence as Storytellers
- Author
-
Simone Downie
- Subjects
Co-design ,media_common.quotation_subject ,Reading (process) ,Pedagogy ,ComputingMilieux_PERSONALCOMPUTING ,Narrative ,Content (Freudian dream analysis) ,Psychology ,The arts ,Literacy ,Term (time) ,Storytelling ,media_common - Abstract
Literacy is a complex term that involves an array of intertwined processes, including motivation. Students who lack this internal willingness often fail to engage with books and other literacy-related content, which in turn negatively affects their learning outcomes. Murder on Mansion Hill is a digital tabletop game that weaves together, in a cohesive manner, arts-based strategies shown to improve students' involvement in reading, including storytelling, collage and co-design. By blending these strategies with the motivational benefits of gameplay, Mansion Hill encourages players to take a highly active role in developing and sharing imaginative narratives, helping them to view stories as an enjoyable and rewarding outlet they can share with friends. In playtests, players overcame fear of judgment, enjoyed engaging with narrative in multimodal ways, and bonded with peers.
- Published
- 2021
39. Performance Analysis of the Fireworks Algorithm Versions
- Author
-
Nebojsa Bacanin, Ivana Strumberger, Ira Tuba, Milan Tuba, and Eva Tuba
- Subjects
Dynamic search ,Optimization problem ,Optimization algorithm ,Fireworks algorithm ,Computer science ,business.industry ,media_common.quotation_subject ,Machine learning ,computer.software_genre ,Swarm intelligence ,Term (time) ,Quality (business) ,Artificial intelligence ,business ,Metaheuristic ,computer ,media_common - Abstract
In recent decades, swarm intelligence algorithms have become a powerful tool for solving hard optimization problems, and numerous algorithms have proved good for different problems. With the overwhelming number of algorithms, it has become hard for a common user to choose an appropriate method for solving a certain problem. To provide guidelines, it is necessary to classify optimization metaheuristics according to their capabilities. Deep statistical comparison represents a novel method for comparing and analyzing optimization algorithms. In this paper, the deep statistical comparison method was used to compare different versions of the widely used fireworks algorithm. The fireworks algorithm has been developed and improved over the last ten years, and this paper provides a theoretical analysis of five different versions: a cooperative framework for FWA, bare bones FWA, guided FWA, loser-out tournament based FWA, and dynamic search FWA. Based on the obtained results, the loser-out tournament based FWA has the best performance in terms of solution quality, while the dynamic search FWA is the best in terms of the distribution of solutions in the search space.
- Published
- 2021
40. Critical Web Archive Research
- Author
-
Anat Ben-David
- Subjects
Computer science ,business.industry ,Web archiving ,media_common.quotation_subject ,Best practice ,computer.file_format ,WAR ,Term (time) ,World Wide Web ,Exceptionalism ,Software ,Perception ,Premise ,business ,computer ,media_common - Abstract
Following the familiar distinction between software and hardware, this chapter argues that web archives deserve to be treated as a third category—memoryware: specific forms of preservation techniques which involve both software and hardware, but also crawlers, bots, curators, and users. While historically the term memoryware refers to the art of cementing together bits and pieces of sentimental objects to commemorate loved ones, understanding web archives as a complex socio-technical memoryware moves beyond their perception as bits and pieces of the live Web. Instead, understanding web archives as memoryware hints at the premise of the web’s exceptionalism in media and communication history and calls for revisiting some of the concepts and best practices in web archiving and web archive research that have consolidated over the years. The chapter, therefore, presents new challenges for web archive research by turning a critical eye on web archiving itself and on the specific types of histories that are constructed with web archives.
- Published
- 2021
41. Negotiating Techniques in Import-Export Trade
- Author
-
Charles Chatterjee
- Subjects
Negotiation ,business.industry ,media_common.quotation_subject ,Sovereign power ,Misnomer ,International trade ,business ,Public body ,Term (time) ,media_common - Abstract
The term “international trade” is a misnomer, although this term is regularly used by lawyers without realising that it, in reality, does not exist; the correct term for it would be “transnational trade.” Research suggests that rich countries may be as risky as poor countries—only the nature of risk is different. Of course, if a public body in a country becomes a trading partner, it may always create risks by applying its sovereign power, but it does not happen too often.
- Published
- 2021
42. Long Wavelength Asymptotics of Self-Oscillations of Viscous Incompressible Fluid
- Author
-
S. V. Revina
- Subjects
Physics ,Long wavelength ,Flow (mathematics) ,Basic solution ,media_common.quotation_subject ,Mathematical analysis ,Viscous incompressible fluid ,Series expansion ,Infinity ,Shear flow ,Term (time) ,media_common - Abstract
We obtain the long-wavelength asymptotics of a secondary regime formed at the loss of stability of a stationary spatially periodic shear flow with non-zero average, as one of the spatial periods tends to infinity (the wave number vanishes). It is known that if certain non-degeneracy conditions are satisfied, then a self-oscillatory regime branches from the basic solution. Recurrence formulas for the kth term of the asymptotics of this secondary solution are obtained. To study the bifurcations of the basic flow we apply the scheme of the Lyapunov–Schmidt method proposed by V. I. Yudovich. At each step of the Lyapunov–Schmidt method, a series expansion in the small parameter α is applied.
- Published
- 2021
43. Conceptualizing Interactional Learning Targets for the Second Language Curriculum
- Author
-
Thorsten Huth
- Subjects
Vocabulary ,Second language ,Grammar ,media_common.quotation_subject ,ComputingMilieux_COMPUTERSANDEDUCATION ,Language education ,Psychology ,Curriculum ,Competence (human resources) ,Sentence ,Linguistics ,Term (time) ,media_common - Abstract
The skill, awareness and perhaps competence of language learners to engage in specific interactional behaviors are currently encapsulated in the term Interactional Competence (IC). While IC-oriented teaching materials propose learning targets beyond vocabulary and sentence-level grammar, their action-oriented view of what language is and how language works contrasts with basic notions of language and proficiency currently reflected in transnational curriculum guidelines and assessment protocols. This paper illustrates how IC-oriented teaching materials can be conceptualized for, and used in, the second language classroom and argues that their use encourages a rethinking of the notions of language and proficiency for second language teaching.
- Published
- 2021
44. Scaring Away the Spectre of Equivocation: A Comment
- Author
-
Sonja Schierbaum
- Subjects
State (polity) ,Metaphor ,Argument ,media_common.quotation_subject ,Intentionality ,Equivocation ,Sociology ,media_common ,Term (time) ,Epistemology - Abstract
My general aim in commenting on Baltuța’s paper is to elucidate the metaphor of a dialogue she uses to characterize the general, methodological framework of her undertaking. For this purpose, I turn to a certain strand of the contemporary discussion of the role of the historian of philosophy. According to Baltuța, the determination of the limits of such a dialogue is a matter of degree, not of principle. I agree with her. My concern is that the determination of the limits of the dialogue is thwarted if the central, contemporary notion that is taken to be applicable to the historical account in question is used equivocally. In my view, there is a serious concern that Baltuța’s account equivocates on the term “intentional,” making it difficult to determine the limits of the dialogue between contemporary and medieval views on the intentionality of pain. My specific aim in this comment is to show that although Baltuța equivocates on the term “intentional,” her analysis of Kilwardby’s account can be adjusted such that her argument succeeds in showing that pain is in fact an intentional state for Kilwardby.
- Published
- 2021
45. Making Things Explainable vs Explaining: Requirements and Challenges Under the GDPR
- Author
-
Fabio Vitali, Francesco Sovrano, Monica Palmirani, V. Rodríguez-Doncel, M. Araszkiewicz, P. Casanovas, U. Pagallo, and G. Sartor
- Subjects
Point (typography) ,Computer science ,05 social sciences ,050301 education ,02 engineering and technology ,Space (commercial competition) ,Trustworthy AI, explanatorY AI (YAI), XAI, HCI ,Term (time) ,Epistemology ,Identification (information) ,General Data Protection Regulation ,0202 electrical engineering, electronic engineering, information engineering ,media_common.cataloged_instance ,020201 artificial intelligence & image processing ,Narrative ,European union ,User needs ,0503 education ,media_common - Abstract
The European Union (EU), through the High-Level Expert Group on Artificial Intelligence (AI-HLEG) and the General Data Protection Regulation (GDPR), has recently posed an interesting challenge to the eXplainable AI (XAI) community by demanding a more user-centred approach to explaining Automated Decision-Making systems (ADMs). Looking at the relevant literature, XAI is currently focused on producing explainable software and explanations that generally follow an approach we could term One-Size-Fits-All, which is unable to meet the requirement of centring on user needs. One cause of this limit is the belief that making things explainable alone is enough to have pragmatic explanations. Thus, insisting on a clear separation between explainability (something that can be explained) and explanations, we point to explanatorY AI (YAI) as an alternative and more powerful approach to win the AI-HLEG challenge. YAI builds on XAI with the goal of collecting and organizing explainable information, articulating it into what we call user-centred explanatory discourses. Through the use of explanatory discourses/narratives, we recast the problem of generating explanations for Automated Decision-Making systems (ADMs) as the identification of an appropriate path over an explanatory space, allowing explainees to interactively explore it and produce the explanation best suited to their needs.
- Published
- 2021
46. New Work as an Opportunity for Performance Excellence
- Author
-
Marc Helmold
- Subjects
Fact sheet ,Knowledge management ,Work (electrical) ,business.industry ,Excellence ,Association (object-oriented programming) ,media_common.quotation_subject ,Core competency ,Set (psychology) ,business ,Psychology ,Term (time) ,media_common - Abstract
Competencies are a set of integrated knowledge, abilities and attributes that translate into behaviours and help define, in greater detail, what is needed to successfully perform a job. Competencies are not skills, although they are similar: skills are learned, while competencies are inherent qualities an individual possesses, combining skills, knowledge and ability. Two main meanings of the term competency have been identified: one referring to the outputs, or results, of training, that is, competent performance; the other referring to the inputs, or underlying attributes, required of a person to achieve competent performance. Each definition has been used to describe both individual and organizational competencies (Hoffmann, 1999). The National Association of Colleges and Employers (NACE) (NACE, 2020) recently released a fact sheet defining seven core competencies that form career readiness.
- Published
- 2021
47. Knowing Well-being: A History of Data
- Author
-
Susan Oman
- Subjects
media_common.quotation_subject ,Ancient philosophy ,Well-being ,Happiness ,Measure (physics) ,Sociology ,media_common ,Term (time) ,Epistemology - Abstract
What is well-being? Well-being has become synonymous with the multi-billion-dollar wellness industry, whilst also being rooted in ancient philosophy and religious practices. It has no universal definition across time, place or scientific discipline, yet the very term 'statistics' was invented to measure human happiness. This chapter contextualises the history of well-being data and development as one tied to political and technological change: firstly, in the desire to monitor human welfare, and secondly, for policy. Public management strategies embraced economic approaches to auditing as a means to define value and efficiency in social policy choices. The chapter considers how well-being data became co-opted into an ostensibly rational process of decision-making and evaluation, becoming a tool of policy, for good and bad.
- Published
- 2021
48. Long Term Predictions of NO2 Average Values via Deep Learning
- Author
-
Stefano Bilotta, Gianni Pantaleo, Daniele Cenni, Enrico Collini, Paolo Nesi, Pierfrancesco Bellini, and Michela Paolucci
- Subjects
Operations research ,Computer science ,Smart city ,Dashboard (business) ,media_common.cataloged_instance ,Context (language use) ,European union ,Traffic flow ,Air quality index ,Field (computer science) ,media_common ,Term (time) - Abstract
Forecasting future values of air-quality metrics and specific pollutant concentrations can be of pivotal importance in recent Smart City perspectives. A number of pollutants are dangerous for people’s health and have an impact on the environment and climate. In order to control and reduce emissions, national and international organizations have defined guidelines and target limits to be respected now and to be progressively tightened over the years/months. In this regard, the European Union has set a limit for the yearly mean concentration of NO2, which must not exceed 40 µg/m3. To this end, in this paper we propose a model and tool to compute long-term predictions, up to 180 days in advance, of the progressive mean value of NO2 with the precision needed to enable decision makers to perform corrections. The proposed solution is based on a machine learning approach taking into account measurements of pollutants, traffic flow, weather and environmental variables coming from sensors in the field. A comparison of different techniques is provided. The research activity has been developed in the context of the TRAFAIR CEF project of the EC, which aimed to study the effect of traffic and other human activities on NO and NO2. The data and the solution have been developed by exploiting the Snap4City platform; the validation of the solution has been performed using actual measured data from the years 2014 to 2020 in the area of Florence, Italy. The results are accessible via a Snap4City monitoring dashboard reporting real-time values and predictions.
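The "progressive mean" the abstract targets is a running average of NO2 concentrations from the start of the year, compared against the EU yearly limit of 40 µg/m3. A minimal sketch of that bookkeeping (the daily values here are hypothetical, not data from the paper, and the paper's actual predictor is a machine learning model, not this loop):

```python
def progressive_mean(daily_values):
    """Return the running (progressive) mean after each day."""
    means = []
    total = 0.0
    for day, value in enumerate(daily_values, start=1):
        total += value
        means.append(total / day)
    return means

EU_ANNUAL_LIMIT = 40.0  # µg/m3, EU yearly mean limit for NO2

daily_no2 = [52.0, 48.0, 35.0, 30.0, 28.0]  # hypothetical daily means
running = progressive_mean(daily_no2)
# Flag each day on which the year-to-date mean still exceeds the limit;
# a long-term forecaster would extend `running` up to 180 days ahead so
# decision makers can intervene before the year-end mean breaches it.
over_limit = [m > EU_ANNUAL_LIMIT for m in running]
```

With these illustrative values the progressive mean falls from 52.0 to 38.6 µg/m3, dropping below the limit only on the final day.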
- Published
- 2021
49. Epilogue: A Fictional Conversation Between Father and Daughter
- Author
-
Chun-Kwok Lau
- Subjects
Daughter ,Psychoanalysis ,Prologue ,media_common.quotation_subject ,Multiculturalism ,Conversation ,Sociology ,Set (psychology) ,Construct (philosophy) ,media_common ,Term (time) - Abstract
In this final chapter, I return to the puzzle I set out in the Prologue. I create a fictional conversation between my daughter and me, in which I deliberate on how we are trying to construct our fluid identities in this multicultural world through our life experience or, in other words, our education.
- Published
- 2021
50. Function-Based Treatments for Severe Problem Behavior
- Author
-
Nathan A. Call and Sarah K. Slocum
- Subjects
Antecedent (behavioral psychology) ,Aggression ,media_common.quotation_subject ,Psychological intervention ,medicine ,Extinction (psychology) ,medicine.symptom ,Psychology ,Function (engineering) ,Differential reinforcement ,Cognitive psychology ,Term (time) ,media_common - Abstract
Many individuals, especially those with intellectual and developmental disabilities, display problem behavior in the form of aggression, self-injurious behavior, and disruption, among others. Treatments based on the function or reason the problem behavior occurs produce better outcomes than those selected arbitrarily. The current chapter presents seminal and current literature on common function-based treatments for problem behavior, including but not limited to extinction, differential reinforcement, and antecedent interventions. This chapter also includes research and recommendations for approaches to make function-based treatments more feasible for interventionists in the long term. Finally, future research directions in the area of function-based treatments are presented.
- Published
- 2021