48 results for "Jay McCarthy"
Search Results
2. Model-Checking Task Parallel Programs for Data-Race.
- Author
- Radha Nakade, Eric Mercer, Peter Aldous, and Jay McCarthy
- Published
- 2018
- Full Text
- View/download PDF
3. Deploying Exploring Computer Science Statewide.
- Author
- Helen H. Hu, Cecily Heiner, and Jay McCarthy
- Published
- 2016
- Full Text
- View/download PDF
4. Bithoven: Gödel encoding of chamber music and functional 8-bit audio synthesis.
- Author
- Jay McCarthy
- Published
- 2016
- Full Text
- View/download PDF
5. Running Probabilistic Programs Backwards.
- Author
- Neil Toronto, Jay McCarthy, and David Van Horn
- Published
- 2015
- Full Text
- View/download PDF
6. Proving MCAPI executions are correct using SMT.
- Author
- Yu Huang, Eric Mercer, and Jay McCarthy
- Published
- 2013
- Full Text
- View/download PDF
7. Computing in Cantor's Paradise with λ ZFC.
- Author
- Neil Toronto and Jay McCarthy
- Published
- 2012
- Full Text
- View/download PDF
8. Modeling Asynchronous Message Passing for C Programs.
- Author
- Everett Morse, Nick Vrvilo, Eric Mercer, and Jay McCarthy
- Published
- 2012
- Full Text
- View/download PDF
9. Temporal higher-order contracts.
- Author
- Tim Disney, Cormac Flanagan, and Jay McCarthy
- Published
- 2011
- Full Text
- View/download PDF
10. From Bayesian Notation to Pure Racket via Discrete Measure-Theoretic Probability in λ ZFC.
- Author
- Neil Toronto and Jay McCarthy
- Published
- 2010
- Full Text
- View/download PDF
11. Newly-single and loving it: improving higher-order must-alias analysis with heap fragments
- Author
- Jay McCarthy and Kimball Germane
- Subjects
- Recursion, Computer science, Programming language, Thread (computing), Conflation, Alias analysis, Sketch, Control flow analysis, Resource (project management), Safety, Risk, Reliability and Quality, Software, Heap (data structure)
- Abstract
Theories of higher-order must-alias analysis, often under the guise of environment analysis, provide deep behavioral insight. But these theories---in particular those that are most insightful otherwise---can reason about recursion only in limited cases. This weakness is not inherent to the theories but to the frameworks in which they're defined: machine models which thread the heap through evaluation. Since these frameworks allocate each abstract resource in the heap, the constituent theories of environment analysis conflate co-live resources identified in the abstract, such as recursively-created bindings. We present heap fragments as a general technique to allow these theories to reason about recursion in a general and robust way. We instantiate abstract counting in a heap-fragment framework and compare its performance to a precursor entire-heap framework. We also sketch an approach to realizing binding invariants, a more powerful environment analysis, in the heap-fragment framework.
- Published
- 2021
12. Model-checking task-parallel programs for data-race
- Author
- Radha Nakade, Eric Mercer, Peter Aldous, Kyle Storey, Benjamin Ogles, Joshua Hooker, Sheridan Jacob Powell, and Jay McCarthy
- Subjects
- Software
- Published
- 2019
13. Brief of Amicus Curiae Interdisciplinary Research Team on Programmer Creativity Addressing Expressions and Ideas
- Author
- Kavitha Chandra, Firas Khatib, Ralph D. Clifford, Trina C. Kershaw, and Jay McCarthy
- Subjects
- World Wide Web, Source lines of code, Copying, Fair use, Computer program, Computer science, Interoperability, Android (operating system), Programmer, Oracle
- Abstract
Google v. Oracle is a case involving the meaning of a copyright in computer software. The trial determined that Google copied many thousands of lines of code that had been written by Oracle. The code that was taken defined the existence of specific computing functions, such as determining which of two numbers is the larger. Although Google did take these definitions, its programmers re-wrote the code that was needed to perform the functions. Once taken, the code was used by Google to create (and then sell) the Android system for cell phones. The first major issue in the case is understanding what owning a copyright in computer software means. All agree that a copyright only protects an “expression” and does not protect any underlying ideas. The first dispute is whether the defining code is an expression or an idea. The second issue in the case is the meaning of the “fair use” defense, under which copying is allowed even though the copyright exists. Here, the main question is whether allowing computer programs to be written in a way that favors easy interoperability constitutes fair use. The brief submitted by the team of researchers based at U.Mass. Dartmouth and U.Mass. Lowell and written by Professor Clifford of the U.Mass. Law School only addresses the first issue. It argues that the declaring code that was taken is clearly expressive, based on the Team’s research findings that there is significant variation of expression shown in all computer programs, even the most basic.
- Published
- 2020
14. A Coq library for internal verification of running-times
- Author
- Burke Fetscher, Daniel Feltey, Robert Bruce Findler, Max S. New, and Jay McCarthy
- Subjects
- Structure (mathematical logic), Insertion sort, Fibonacci number, Computer science, Programming language, Tree (data structure), Tree traversal, Monad (non-standard analysis), Code (cryptography), Merge sort, Software
- Abstract
This paper presents a Coq library that lifts an abstract yet precise notion of running-time into the type of a function. Our library is based on a monad that counts abstract steps. The monad's computational content, however, is simply that of the identity monad so programs written in our monad (that recur on the natural structure of their arguments) extract into idiomatic OCaml code. We evaluated the expressiveness of the library by proving that red-black tree insertion and search, merge sort, insertion sort, various Fibonacci number implementations, iterated list insertion, various BigNum operations, and Okasaki's Braun Tree algorithms all have their expected running times.
- Published
- 2018
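The abstract above describes a monad that counts abstract steps while its computational content stays that of the identity monad. A rough Python analogue (our own sketch; the names `ret`, `tick`, and `bind` are ours, not the paper's Coq library API) pairs each value with a step count, has `bind` sum the counts, and charges one abstract step per comparison in insertion sort:

```python
# Sketch of a step-counting monad (hypothetical names, not the paper's Coq API).
# A computation is a (value, steps) pair; `bind` threads the count through,
# so the value component behaves exactly like the plain, unannotated program.

def ret(v):
    return (v, 0)

def tick(m):
    v, n = m
    return (v, n + 1)

def bind(m, f):
    v, n = m
    v2, n2 = f(v)
    return (v2, n + n2)

def insert(x, xs):
    # one abstract step per comparison (plus one for the base case)
    if not xs:
        return tick(ret([x]))
    if x <= xs[0]:
        return tick(ret([x] + xs))
    return tick(bind(insert(x, xs[1:]), lambda rest: ret([xs[0]] + rest)))

def isort(xs):
    if not xs:
        return ret([])
    return bind(isort(xs[1:]), lambda sorted_rest: insert(xs[0], sorted_rest))

result, steps = isort([3, 2, 1])
```

Under this cost model the reversed list [3, 2, 1] charges 1 + 2 + 3 = 6 steps, the n(n+1)/2 worst-case bound one would prove for insertion sort in the paper's library.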
15. A programmable programming language
- Author
- Shriram Krishnamurthi, Sam Tobin-Hochstadt, Matthew Flatt, Matthias Felleisen, Eli Barzilay, Jay McCarthy, and Robert Bruce Findler
- Subjects
- Software, General Computer Science, Computer science, Programming language
- Abstract
As the software industry enters the era of language-oriented programming, it needs programmable programming languages.
- Published
- 2018
16. A systematic study of the photo-disintegration of germanium isotopes
- Author
- Jon Jay McCarthy
- Published
- 2018
17. Model-Checking Task Parallel Programs for Data-Race
- Author
- Eric Mercer, Jay McCarthy, Radha Nakade, and Peter Aldous
- Subjects
- Model checking, Schedule, Correctness, Computer science, Programming language, Context (computing), Cilk, Static analysis, Task (project management), Formal verification
- Abstract
Data-race detection is the problem of determining if a concurrent program has a data-race in some execution and input; it has been long studied and often solved. The research in this paper reprises the problem in the context of task parallel programs with the intent to prove, via model checking, the absence of data-race on any feasible schedule for a given input. Many of the correctness properties afforded by task parallel programming models such as OpenMP, Cilk, X10, Chapel, Habanero, etc. rely on data-race freedom. Model checking for data-race, presented here, is in contrast to recent work using run-time monitoring, log analysis, or static analysis which are complete or sound but never both. The model checking algorithm builds a happens-before relation from the program execution and uses that relation to detect data-race similar to many solutions that reason over a single observed execution. Unlike those solutions, model checking generates additional program schedules sufficient to prove data-race freedom over all schedules on the given input. The approach is evaluated in a Java implementation of Habanero using the JavaPathfinder model checker. The results, when compared to existing data-race detectors in Java Pathfinder, show a significant reduction in the time required for proving data race freedom.
- Published
- 2018
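The happens-before relation the abstract builds from a program execution can be illustrated with vector clocks. The sketch below is our own single-trace check, not the paper's JavaPathfinder implementation; the paper's contribution is precisely that it also generates the additional schedules this sketch does not explore:

```python
# Single-trace happens-before race check via vector clocks (illustrative only).
from collections import defaultdict

def find_races(trace, nthreads):
    clocks = [[0] * nthreads for _ in range(nthreads)]  # per-thread vector clocks
    lock_clock = defaultdict(lambda: [0] * nthreads)    # clock stored at each release
    accesses = defaultdict(list)                        # addr -> (tid, clock, is_write)
    races = []
    for tid, op, arg in trace:
        clocks[tid][tid] += 1                           # local step
        if op == "acquire":                             # join with the releaser's clock
            clocks[tid] = [max(a, b) for a, b in zip(clocks[tid], lock_clock[arg])]
        elif op == "release":
            lock_clock[arg] = list(clocks[tid])
        else:                                           # "read" or "write" of addr `arg`
            is_write = op == "write"
            for prev_tid, prev_clock, prev_write in accesses[arg]:
                conflicting = prev_tid != tid and (is_write or prev_write)
                ordered = prev_clock[prev_tid] <= clocks[tid][prev_tid]
                if conflicting and not ordered:         # unordered conflict = race
                    races.append((arg, prev_tid, tid))
            accesses[arg].append((tid, list(clocks[tid]), is_write))
    return races

unsynchronized = [(0, "write", "x"), (1, "write", "x")]
locked = [(0, "acquire", "l"), (0, "write", "x"), (0, "release", "l"),
          (1, "acquire", "l"), (1, "write", "x"), (1, "release", "l")]
```

On `unsynchronized` the two writes to `x` are unordered and flagged; on `locked` the release/acquire edge orders them, so no race is reported.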
18. Fair enumeration combinators
- Author
- Burke Fetscher, Jay McCarthy, Max S. New, and Robert Bruce Findler
- Subjects
- Theoretical computer science, Semantics (computer science), Computer science, Natural number, Mathematical proof, Enumeration, Element (category theory), Combinatory logic, Bijection, injection and surjection, Algorithm, Software, Generator (mathematics)
- Abstract
Enumerations represented as bijections between the natural numbers and elements of some given type have recently garnered interest in property-based testing because of their efficiency and flexibility. There are, however, many ways of defining these bijections, some of which are better than others. This paper offers a new property of enumeration combinators called fairness that identifies enumeration combinators that are better suited to property-based testing. Intuitively, the result of a fair combinator indexes into its argument enumerations equally when constructing its result. For example, extracting the nth element from our enumeration of three-tuples indexes about $\sqrt[3]{n}$ elements into each of its components instead of, say, indexing $\sqrt[2]{n}$ into one and $\sqrt[4]{n}$ into the other two, as you would if a three-tuple were built out of nested pairs. Similarly, extracting the nth element from our enumeration of a three-way union returns an element that is $\frac{n}{3}$ into one of the argument enumerators. The paper presents a semantics of enumeration combinators, a theory of fairness, proofs establishing fairness of our new combinators, and proofs that some combinations of fair combinators are not fair. We also report on an evaluation of fairness for the purpose of finding bugs in programming-language models. We show that fair enumeration combinators have complementary strengths to an existing, well-tuned ad hoc random generator (better on short time scales and worse on long time scales) and that using unfair combinators is worse across the board.
- Published
- 2017
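The indexing behavior the abstract describes is easiest to see with a concrete bijection. The Cantor-style pairing below is an illustrative encoding of our own, not the paper's combinator library: it enumerates pairs diagonal by diagonal, so both components of the nth pair stay within $O(\sqrt{n})$, which is the fairness property for a two-way pairing combinator.

```python
# A fair N <-> N x N bijection: walk the diagonals x + y = 0, 1, 2, ...
import math

def pair(x, y):
    d = x + y                        # which diagonal the pair sits on
    return d * (d + 1) // 2 + y      # offset of that diagonal, plus position in it

def unpair(z):
    # invert: find the diagonal d with d(d+1)/2 <= z, then the position within it
    d = (math.isqrt(8 * z + 1) - 1) // 2
    y = z - d * (d + 1) // 2
    x = d - y
    return (x, y)

# Fairness: at index n, each component of unpair(n) is at most about sqrt(2n),
# i.e. the two argument enumerations are consumed at the same rate. Nesting
# pairs instead would drive one component ahead of the other.
```

A fair three-tuple enumeration can be built the same way over the planes x + y + z = d, keeping each component near $\sqrt[3]{n}$ as in the abstract.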
19. Practically Accurate Floating-Point Math
- Author
- Neil Toronto and Jay McCarthy
- Subjects
- IEEE 754-1985, Floating point, General Computer Science, Computer science, Decimal floating point, General Engineering, Binary scaling, Floating-point unit, Double-precision floating-point format, Parallel computing, Minifloat, Single-precision floating-point format, Computational science, IBM Floating Point Architecture, NaN, Fixed-point arithmetic, x87, Half-precision floating-point format
- Abstract
With the right tools, floating-point code can be debugged like any other code, drastically improving its accuracy and reliability.
- Published
- 2014
20. Special issue on Trends in Functional Programming 2013/14
- Author
- Jay McCarthy and Jurriaan Hage
- Subjects
- Functional programming, Computer Networks and Communications, Computer science, Software engineering, Software
- Published
- 2018
21. Proceedings of the 4th and 5th International Workshop on Trends in Functional Programming in Education
- Author
- Johan Jeuring and Jay McCarthy
- Subjects
- Engineering, Functional programming, Engineering management
- Published
- 2016
22. Deploying Exploring Computer Science Statewide
- Author
- Cecily Heiner, Jay McCarthy, and Helen H. Hu
- Subjects
- Class (computer programming), Medical education, Computer science, Software deployment, Curriculum, Simulation, Graduation
- Abstract
Exploring Computer Science (ECS) is a high school introductory computer science class designed to increase student interest in CS. Utah is the first state to offer ECS statewide and use it to meet a high school graduation requirement. Over the past four years, 150 teachers have been trained as Utah ECS teachers and over 10,000 Utah students have taken the class. The Utah initiative is unique because it is the first to deploy ECS in a non-urban environment and with a modified half-year curriculum that includes no additional equipment costs. This paper discusses how the Utah deployment was organized, reports its results and unique difficulties, and offers lessons for deployments with similar characteristics: statewide, rural, and limited resources.
- Published
- 2016
23. A Coq Library for Internal Verification of Running-Times
- Author
- Max S. New, Burke Fetscher, Jay McCarthy, Daniel Feltey, and Robert Bruce Findler
- Subjects
- Insertion sort, Structure (mathematical logic), Code (set theory), Fibonacci number, Computer science, Programming language, Monad (functional programming), Tree (data structure), Tree traversal, Merge sort
- Abstract
This paper presents a Coq library that lifts an abstract yet precise notion of running-time into the type of a function. Our library is based on a monad that counts abstract steps, controlled by one of the monadic operations. The monad’s computational content, however, is simply that of the identity monad so programs written in our monad (that recur on the natural structure of their arguments) extract into idiomatic OCaml code. We evaluated the expressiveness of the library by proving that red-black tree insertion and search, merge sort, insertion sort, Fibonacci, iterated list insertion, BigNum addition, and Okasaki’s Braun Tree algorithms all have their expected running times.
- Published
- 2016
24. Trends in Functional Programming : 15th International Symposium, TFP 2014, Soesterberg, The Netherlands, May 26-28, 2014. Revised Selected Papers
- Author
- Jurriaan Hage and Jay McCarthy
- Subjects
- Computer programming, Compilers (Computer programs), Computer science, Artificial intelligence
- Abstract
This book constitutes the thoroughly refereed revised selected papers of the 15th International Symposium on Trends in Functional Programming, TFP 2014, held in Soesterberg, The Netherlands, in May 2014. The 8 revised full papers included in this volume were carefully reviewed and selected from 22 submissions. TFP is an international forum for researchers with interests in all aspects of functional programming, taking a broad view of current and future trends in the area.
- Published
- 2014
25. Implementation and use of the PLT scheme Web server
- Author
- Paul T. Graunke, Matthias Felleisen, Shriram Krishnamurthi, Greg Pettyjohn, Jay McCarthy, and Peter Walton Hopkins
- Subjects
- Scheme (programming language), Web server, Conference management, Database, Application server, Computer science, Process (engineering), Computational intelligence, Computer Science Applications, Program development, Software engineering, Software
- Abstract
The PLT Scheme Web Server uses continuations to enable a natural, console-like program development style. We describe the implementation of the server and its use in the development of an application for managing conference paper reviews. In the process of developing this application, we encountered subtle forms of interaction not directly addressed by using continuations. We discuss these subtleties and offer solutions that have been successfully deployed in our application. Finally, we present some details on the server's performance, which is comparable to that of the widely-used Apache Web server.
- Published
- 2007
26. Running Probabilistic Programs Backwards
- Author
- David Van Horn, Neil Toronto, and Jay McCarthy
- Subjects
- Domain-specific language, Generality, Theoretical computer science, Computer science, Semantics (computer science), Carry (arithmetic), Probabilistic logic, Machine learning, Bayesian inference, Order (exchange), Artificial intelligence, Probabilistic relevance model
- Abstract
Many probabilistic programming languages allow programs to be run under constraints in order to carry out Bayesian inference. Running programs under constraints could enable other uses such as rare event simulation and probabilistic verification—except that all such probabilistic languages are necessarily limited because they are defined or implemented in terms of an impoverished theory of probability. Measure-theoretic probability provides a more general foundation, but its generality makes finding computational content difficult.
- Published
- 2015
27. Trends in Functional Programming
- Author
- Jurriaan Hage and Jay McCarthy
- Subjects
- Functional programming, Programming language, Computer science
- Published
- 2015
28. Trends in Functional Programming : 14th International Symposium, TFP 2013, Provo, UT, USA, May 14-16, 2013, Revised Selected Papers
- Author
- Jay McCarthy
- Subjects
- Computer programming, Compilers (Computer programs), Computer science
- Abstract
This book constitutes the thoroughly refereed revised selected papers of the 14th International Symposium on Trends in Functional Programming, TFP 2013, held in Provo, UT, USA, in May 2013. The 10 revised full papers included in this volume were carefully reviewed and selected from 27 submissions. They cover topics such as distributed systems, education, functional language implementation, hardware synthesis, static analysis, testing and total programming.
- Published
- 2013
29. The Racket Manifesto
- Author
- Matthias Felleisen, Robert Bruce Findler, Matthew Flatt, Shriram Krishnamurthi, Eli Barzilay, Jay McCarthy, and Sam Tobin-Hochstadt
- Abstract
The creation of a programming language calls for guiding principles that point the developers to goals. This article spells out the three basic principles behind the 20-year development of Racket. First, programming is about stating and solving problems, and this activity normally takes place in a context with its own language of discourse; good programmers ought to formulate this language as a programming language. Hence, Racket is a programming language for creating new programming languages. Second, by following this language-oriented approach to programming, systems become multi-lingual collections of interconnected components. Each language and component must be able to protect its specific invariants. In support, Racket offers protection mechanisms to implement a full language spectrum, from C-level bit manipulation to soundly typed extensions. Third, because Racket considers programming as problem solving in the correct language, Racket also turns extra-linguistic mechanisms into linguistic constructs, especially mechanisms for managing resources and projects. The paper explains these principles and how Racket lives up to them, presents the evaluation framework behind the design process, and concludes with a sketch of Racket's imperfections and opportunities for future improvements.
- Published
- 2015
- Full Text
- View/download PDF
30. Proving MCAPI executions are correct using SMT
- Author
- Jay McCarthy, Eric Mercer, and Yu Huang
- Subjects
- Software portability, Correctness, Schedule (computer science), Debugging, Computer science, Programming language, MCAPI, Message passing, Parallel computing
- Abstract
Asynchronous message passing is an important paradigm in writing applications for embedded heterogeneous multicore systems. The Multicore Association (MCA), an industry consortium promoting multicore technology, is working to standardize message passing into a single API, MCAPI, for bare metal implementation and portability across platforms. Correctness in such an API is difficult to reason about manually, and testing against reference solutions is equally difficult, as reference solutions implement an unknown set of allowed behaviors, and programmers have no way to directly control API internals to expose or reproduce errors. This paper provides a way to encode an MCAPI execution as a Satisfiability Modulo Theories (SMT) problem which, if satisfiable, yields a feasible execution schedule on the same trace that resolves non-determinism in the MCAPI runtime in a way that fails user-provided assertions. The paper proves the problem is NP-complete. The encoding is useful for test, debug, and verification of MCAPI program execution. The novelty in the encoding is the direct use of match pairs (potential send and receive couplings). Match-pair encoding for MCAPI executions, when compared to other encoding strategies, is simpler to reason about, results in significantly fewer terms in the SMT problem, and captures feasible behaviors that are ignored in previously published techniques. Further, to our knowledge, this is the first SMT encoding that is able to run in infinite-buffer semantics, meaning the runtime has unlimited internal buffering as opposed to no internal buffering. Results demonstrate that the SMT encoding, restricted to zero-buffer semantics, uses fewer clauses when compared to another zero-buffer technique, and it runs faster and uses less memory. As a result, the encoding scales well for programs with high levels of non-determinism in how sends and receives may potentially match.
- Published
- 2013
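The match pairs the abstract centers on (potential send/receive couplings) can be illustrated without an SMT solver. This brute-force Python sketch is ours; the paper instead encodes match pairs as SMT constraints and proves the decision problem NP-complete, so enumeration like this does not scale the way the encoding does:

```python
# Brute-force analogue of the match-pair idea: enumerate every way concurrent
# sends can couple with a receiver's wildcard receives, and report the
# couplings under which a user-provided assertion fails.
from itertools import permutations

def violating_schedules(sends, assertion):
    bad = []
    for order in permutations(sends):   # each order is one set of match pairs
        if not assertion(list(order)):
            bad.append(list(order))
    return bad

# Two concurrent sends ("a" and "b") may match two wildcard receives in either
# order; the assertion that "a" arrives first holds on only one of them.
bad = violating_schedules(["a", "b"], lambda received: received[0] == "a")
```

Here `bad` contains exactly the schedule in which "b" is matched to the first receive, the non-deterministic outcome a tester could otherwise miss.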
31. Teaching garbage collection without implementing compilers or interpreters
- Author
- Arjun Guha, Jay McCarthy, Gregory H. Cooper, Shriram Krishnamurthi, and Robert Bruce Findler
- Subjects
- Manual memory management, Computer science, Programming language, Debugging, Garbage in, garbage out, Compiler, Software engineering, Garbage, Interpreter, Heap (data structure), Garbage collection
- Abstract
Given the widespread use of memory-safe languages, students must understand garbage collection well. Following a constructivist philosophy, an effective approach would be to have them implement garbage collectors. Unfortunately, a full implementation depends on substantial knowledge of compilers and runtime systems, which many courses do not cover or cannot assume. This paper presents an instructive approach to teaching GC, where students implement it atop a simplified stack and heap. Our approach eliminates enormous curricular dependencies while preserving the essence of GC algorithms. We take pains to enable testability and comprehensibility and to facilitate debugging. Our approach has been successfully classroom-tested for several years at several institutions.
- Published
- 2013
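The kind of exercise the abstract describes, implementing a collector atop a simplified stack and heap, can be sketched in a few lines. The representation below (a dict from address to the addresses a cell references) is our own simplification, not the paper's actual student-facing interface:

```python
# Mark-and-sweep over a simplified heap: cells are addresses, each mapped to
# the list of addresses it references; roots stand in for the stack.

def collect(heap, roots):
    marked = set()
    stack = list(roots)
    while stack:                 # mark: everything reachable from the roots
        addr = stack.pop()
        if addr in marked:
            continue
        marked.add(addr)
        stack.extend(heap[addr])
    for addr in list(heap):      # sweep: reclaim every unmarked cell
        if addr not in marked:
            del heap[addr]
    return heap

heap = {0: [1], 1: [], 2: [2]}   # cell 2 is an unreachable self-loop
collect(heap, roots=[0])         # leaves only cells 0 and 1
```

Students can test such a collector directly (no compiler or interpreter required) by asserting which addresses survive a collection, which is the pedagogical point of the paper.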
32. Run your research
- Author
- Carl Eastlund, Robert Bruce Findler, Matthias Felleisen, Matthew Flatt, Sam Tobin-Hochstadt, Christos Dimoulas, John Clements, Jon Rafkind, Jay McCarthy, and Casey Klein
- Subjects
- Computer science, Programming language, Object language, Natural language programming, Specification language, Semantics, Computer Graphics and Computer-Aided Design, Very high-level programming language, Language primitive, Program analysis, High-level programming language, Formal language, Programming language specification, Data control language, First-generation programming language, Low-level programming language, Software
- Abstract
Formal models serve in many roles in the programming language community. In its primary role, a model communicates the idea of a language design; the architecture of a language tool; or the essence of a program analysis. No matter which role it plays, however, a faulty model doesn't serve its purpose. One way to eliminate flaws from a model is to write it down in a mechanized formal language. It is then possible to state theorems about the model, to prove them, and to check the proofs. Over the past nine years, PLT has developed and explored a lightweight version of this approach, dubbed Redex. In a nutshell, Redex is a domain-specific language for semantic models that is embedded in the Racket programming language. The effort of creating a model in Redex is often no more burdensome than typesetting it with LaTeX; the difference is that Redex comes with tools for the semantics engineering life cycle.
- Published
- 2012
33. Modeling Asynchronous Message Passing for C Programs
- Author
- Everett Allen Morse, Jay McCarthy, Nick Vrvilo, and Eric Mercer
- Subjects
- Model checking, Multi-core processor, Interface (Java), Computer science, Programming language, Concurrency, MCAPI, Message passing, Debugging, Asynchronous communication
- Abstract
This paper presents a formal modeling paradigm for message passing APIs that is callable from C, the dominant language for embedded systems programming, and that provides reasonable assurance that the model correctly captures intended behavior. The model is a suitable reference solution for the API, and it supports putative what-if queries over API scenarios for behavior exploration, reproducibility for test and debug, full exhaustive search, and other advanced model checking analysis methods for C programs that use the API. This paper illustrates the modeling paradigm on the MCAPI interface, a growing industry standard message passing library, showing how the model exposes errors hidden by the C reference solution provided by the Multicore Association.
- Published
- 2012
34. From Bayesian Notation to Pure Racket via Discrete Measure-Theoretic Probability in λ ZFC
- Author
- Neil Toronto and Jay McCarthy
- Subjects
- Domain-specific language, Interpretation (logic), Theoretical computer science, Computer science, Semantics (computer science), Racket, Bayesian probability, Countable set, Discrete measure, Notation
- Abstract
Bayesian practitioners build models of the world without regarding how difficult it will be to answer questions about them. When answering questions, they put off approximating as long as possible, and usually must write programs to compute converging approximations. Writing the programs is distracting, tedious and error-prone, and we wish to relieve them of it by providing languages and compilers. Their style constrains our work: the tools we provide cannot approximate early. Our approach to meeting this constraint is to 1) determine their notation's meaning in a suitable theoretical framework; 2) generalize our interpretation in an uncomputable, exact semantics; 3) approximate the exact semantics and prove convergence; and 4) implement the approximating semantics in Racket (formerly PLT Scheme). In this way, we define languages with at least as much exactness as Bayesian practitioners have in mind, and also put off approximating as long as possible. In this paper, we demonstrate the approach using our preliminary work on discrete (countably infinite) Bayesian models.
- Published
- 2011
35. A Semantics for Context-Sensitive Reduction Semantics
- Author
- Steven Jaconette, Jay McCarthy, Robert Bruce Findler, and Casey Klein
- Subjects
- Computer science, Programming language, Formal semantics (linguistics), Operational semantics, Action semantics, Denotational semantics, Well-founded semantics, Computational semantics, Proof-theoretic semantics, Artificial intelligence, Failure semantics, Natural language processing
- Abstract
This paper explores the semantics of the meta-notation used in the style of operational semantics introduced by Felleisen and Hieb. Specifically, it defines a formal system that gives precise meanings to the notions of contexts, decomposition, and plugging (recomposition) left implicit in most expositions. This semantics is not naturally algorithmic, so the paper also provides an algorithm and proves a correspondence with the declarative definition. The motivation for this investigation is PLT Redex, a domain-specific programming language designed to support Felleisen-Hieb-style semantics. This style of semantics is the de-facto standard in operational semantics and, as such, is widely used. Accordingly, our goal is that Redex programs should, as much as possible, look and behave like those semantics. Since Redex's first public release more than seven years ago, its precise interpretation of contexts has changed several times, as we repeatedly encountered reduction systems that did not behave according to their authors' intent. This paper describes the culmination of that experience. To the best of our knowledge, the semantics given here accommodates even the most complex uses of contexts available.
- Published
- 2011
36. The two-state solution
- Author
- Jay McCarthy
- Subjects
- Web server, Programming language, Computer science, Program transformation, Access control, Server, Scalability, Web application, The Internet, Web application development, Web modeling
- Abstract
Continuation-based Web servers provide advantages over traditional Web application development through the increased expressive power they allow. This leads to fewer errors and more productivity for the programmers who adopt them. Unfortunately, existing implementation techniques force a hard choice between scalability and expressiveness. Our technique allows a smoother path to scalable, continuation-based Web programs. We present a modular program transformation that allows scalable Web applications to use third-party, higher-order libraries with higher-order arguments that cause Web interaction. Consequently, our system provides existing Web applications with more scalability through significantly less memory use than the traditional technique.
- Published
- 2010
37. Trusted Multiplexing of Cryptographic Protocols
- Author
-
Shriram Krishnamurthi and Jay McCarthy
- Subjects
Protocol (science), Task (computing), Cryptographic primitive, Computer science, Proof assistant, Cryptographic protocol, Computer security, Multiplexing, Multiplexer, Controlled Cryptographic Item - Abstract
We present an analysis that determines when it is possible to multiplex a pair of cryptographic protocols. We present a transformation that improves the coverage of this analysis on common protocol formulations. We discuss the gap between the merely possible and the pragmatic through an optimization that informs a multiplexer. We also address the security ramifications of trusting external parties for this task and evaluate our work on a large repository of cryptographic protocols. We have verified this work using the Coq proof assistant.
- Published
- 2010
38. Automatically RESTful web applications
- Author
-
Jay McCarthy
- Subjects
Web server, Computer science, Programming language, Program transformation, Server, Web design, Scalability, Web application, The Internet, Web service, Web application development, Web modeling - Abstract
Continuation-based Web servers provide distinct advantages over traditional Web application development: expressive power and modularity. This power leads to fewer errors and more interesting applications. Furthermore, these Web servers are more than prototypes; they are used in some real commercial applications. Unfortunately, they pay a heavy price for the additional power in the form of a lack of scalability. We fix this key problem with a modular program transformation that produces scalable, continuation-based Web programs based on the REST architecture. Our programs use the same features as non-scalable, continuation-based Web programs, so we do not sacrifice expressive power for performance. In particular, we allow continuation marks in Web programs. Our system uses 10 percent (or less) of the memory required by previous approaches.
- Published
- 2009
39. Minimal backups of cryptographic protocol runs
- Author
-
Jay McCarthy and Shriram Krishnamurthi
- Subjects
Cryptographic primitive, Backup, Computer science, Distributed computing, Universal composability, Proof assistant, Context (language use), Cryptographic protocol, Key management, Protocol (object-oriented programming), Computer network - Abstract
As cryptographic protocols execute they accumulate information such as values and keys, and evidence of properties about this information. As execution proceeds, new information becomes relevant while some old information ceases to be of use. Identifying what information is necessary at each point in a protocol run is valuable for both analysis and deployment. We formalize this necessary information as the minimal backup of a protocol. We present an analysis that determines the minimal backup at each point in a protocol run. We show that this minimal backup has many uses: it serves as a foundation for job migration and other kinds of fault tolerance, and also helps protocol designers understand the structure of protocols and identify potential flaws. In a cryptographic context it is dangerous to reason informally. We have therefore formalized and verified this work using the Coq proof assistant. Additionally, Coq provides a certified implementation of our analysis. Concretely, our analysis and its implementation consume protocols written in a variant of the Cryptographic Protocol Programming Language, CPPL.
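The core computation can be pictured as a backward liveness pass over a straight-line protocol: at each point, a backup need only retain the values some later step still uses. This is an illustrative sketch under that simplification, not the paper's Coq formalization, and every name in it is hypothetical:

```python
# Compute a "minimal backup" per protocol point: the set of values
# still needed by later actions. Standard backward liveness analysis.

def minimal_backups(steps):
    """steps: list of (defined_names, used_names) per protocol action.
    Returns, per point, the names a backup taken *before* that action
    must retain."""
    live = set()
    backups = [None] * len(steps)
    for i in range(len(steps) - 1, -1, -1):
        defines, uses = steps[i]
        live = (live - set(defines)) | set(uses)
        backups[i] = set(live)
    return backups

# Example run: key k is needed through the last step; nonce n is
# created at step 1 and checked at step 2.
protocol = [
    (["k"], []),          # generate key k
    (["n"], ["k"]),       # send {n}_k: uses k, creates n
    ([],    ["k", "n"]),  # verify reply: uses k and n
]
backups = minimal_backups(protocol)
```

Before the first action nothing need be saved; before the last, both `k` and `n` are live, so a job migrating at that point must carry exactly those two values.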
- Published
- 2008
40. Cryptographic Protocol Explication and End-Point Projection
- Author
-
Jay McCarthy and Shriram Krishnamurthi
- Subjects
Explication, Theoretical computer science, Transformation (function), Cryptographic primitive, Computer science, Perspective (graphical), Proof assistant, Cryptographic protocol, Projection (set theory), Protocol (object-oriented programming) - Abstract
Cryptographic protocols are useful for engineering trust in transactions. There are several languages for describing these protocols, but these tend to capture the communications from the perspective of an individual role. In contrast, traditional protocol descriptions as found in a state of nature tend to employ a whole-protocol description, resulting in an impedance mismatch. In this paper we present two results to address this gap between human descriptions and deployable specifications. The first is an end-point projection technique that consumes an explicit whole-protocol description and generates specifications that capture the behavior of each participant role. In practice, however, many whole-protocol descriptions contain idiomatic forms of implicit specification. We therefore present our second result, a transformation that identifies and eliminates these implicit patterns, thereby preparing protocols for end-point projection. Concretely, our tools consume protocols written in our whole-protocol language, WPPL, and generate role descriptions in the cryptographic protocol programming language, CPPL. We have formalized and established properties of the transformations using the Coq proof assistant. We have validated our transformations by applying them successfully to most of the protocols in the SPORE repository.
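The essence of end-point projection can be shown on a tiny whole-protocol description: each "A sends B message m" line becomes a send in A's role program and a receive in B's. This mirrors the idea only; it is not WPPL/CPPL syntax, and the protocol shown is a simplified stand-in:

```python
# Project a whole-protocol description (sender, receiver, message)
# into a per-role sequence of send/receive actions.

def project(protocol, role):
    actions = []
    for sender, receiver, msg in protocol:
        if sender == role:
            actions.append(("send", receiver, msg))
        elif receiver == role:
            actions.append(("recv", sender, msg))
        # Steps not involving this role are dropped from its program.
    return actions

# Simplified Needham-Schroeder-style exchange as a whole protocol.
ns = [
    ("A", "B", "{Na, A}_pkB"),
    ("B", "A", "{Na, Nb}_pkA"),
    ("A", "B", "{Nb}_pkB"),
]
role_a = project(ns, "A")
role_b = project(ns, "B")
```

The interesting work in the paper is everything this sketch omits: implicit idioms in human-written descriptions must first be made explicit before such a projection is even well-defined.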
- Published
- 2008
41. Compiling cryptographic protocols for deployment on the web
- Author
-
Shriram Krishnamurthi, John D. Ramsdell, Jay McCarthy, and Joshua D. Guttman
- Subjects
World Wide Web, Web server, Cryptographic primitive, Software deployment, Computer science, Server, Trust management (information system), Compiler, Cryptographic protocol, Protocol (object-oriented programming) - Abstract
Cryptographic protocols are useful for trust engineering in Web transactions. The Cryptographic Protocol Programming Language (CPPL) provides a model wherein trust management annotations are attached to protocol actions, and are used to constrain the behavior of a protocol participant to be compatible with its own trust policy. The first implementation of CPPL generated stand-alone, single-session servers, making it unsuitable for deploying protocols on the Web. We describe a new compiler that uses a constraint-based analysis to produce multi-session server programs. The resulting programs run without persistent TCP connections for deployment on traditional Web servers. Most importantly, the compiler preserves existing proofs about the protocols. We present an enhanced version of the CPPL language, discuss the generation and use of constraints, show their use in the compiler, formalize the preservation of properties, present subtleties, and outline implementation details.
- Published
- 2007
42. Smart rehabilitation for the 21st century: The Tampa Smart Home for veterans with traumatic brain injury
- Author
-
James L. Fozard, Jan M. Jasiewicz, Jeffrey D. Craighead, William D. Kearns, Jay McCarthy, and Steven Scott
- Subjects
Adult ,Male ,Medication Systems, Hospital ,medicine.medical_specialty ,Activities of daily living ,Traumatic brain injury ,Reminder Systems ,medicine.medical_treatment ,Biomedical Technology ,Psychological intervention ,Rehabilitation Centers ,Appointments and Schedules ,Executive Function ,Young Adult ,Patient safety ,Physical medicine and rehabilitation ,Computer Systems ,medicine ,Humans ,Cognitive rehabilitation therapy ,Veterans Affairs ,Rehabilitation ,business.industry ,medicine.disease ,Polytrauma ,United States ,humanities ,United States Department of Veterans Affairs ,Brain Injuries ,Conditioning, Operant ,Patient Safety ,Medical emergency ,Cognition Disorders ,business ,Reinforcement, Psychology - Abstract
INTRODUCTION

In this editorial, we report on the development of a smart-home-based cognitive prosthetic that will deliver 24/7 rehabilitation at the James A. Haley Veterans' Hospital Polytrauma Transitional Rehabilitation Program (PTRP) facility in Tampa, Florida. The Tampa Smart Home was designed to address two weaknesses identified by PTRP clinicians in the rehabilitation process for patients with traumatic brain injury (TBI): (1) patient safety and (2) inadequate timing and repetition of prompts used to overcome TBI-related cognitive and memory deficits. Smart homes monitor residents' behaviors and provide assistance for various physical and neurological disabilities [1]. The Tampa Smart Home creates a pervasive supportive environment to assist cognitive rehabilitation in patients with TBI [2-3] by continuously identifying the movements and locations of all patient residents and clinical staff. The location information permits the intelligent software to deliver customized prompts and information to the patient via numerous interactive multimedia displays located on walls throughout the PTRP. The residential setting lends itself well to the enriched interactive rehabilitative environment, in which patients with TBI are "immersed" in their rehabilitation, and leverages the "digital generation" of veterans' active technology engagement to facilitate their own recovery [4]. A powerful feature of the Tampa Smart Home is the precision of the customized therapeutic information that can be provided to the recovering veteran. Individual-level data for every interaction with clinical and medical staff and with the interactive displays are recorded continuously and analyzed using state-of-the-art data mining, which, when fully implemented, will allow staff to visualize subtle but therapeutically significant behavioral changes to better inform treatment plans and potentially prevent untoward medication effects on veterans' memory, as well as gait and balance.

This approach is expected to yield important insights into the cognitive recovery process by assisting therapists in targeting problem behaviors for remediation and then linking the behaviors to automata that ensure consistently provided therapy. Consistently delivered automated interventions will shorten recovery time while complementing or reducing therapist monitoring of patient locations and activities within the facility.

BACKGROUND AND RELATED WORK

Department of Veterans Affairs Polytrauma Centers

The signature injuries of soldiers returning from Afghanistan and Iraq are polytrauma and TBI [5-6]. In the majority of Department of Veterans Affairs (VA) clinical cases, polytrauma and TBI are caused by blast injuries from improvised explosive devices, although TBI also results from noncombat events such as motor vehicle accidents. Polytrauma is defined as injuries to two or more body systems from one event. An extreme example would be injuries that simultaneously result in limb amputation, TBI, burns, deafness, and blindness, with long-term physical and cognitive impairments and functional disabilities. TBI, while part of the constellation of injuries encompassing polytrauma, is the most serious and common injury [5]. The variable emotional, cognitive, and behavioral consequences of TBI determine the specific course of rehabilitation [3]. Mild injuries, managed properly, have excellent recovery prospects; moderate to severe injuries require specialized care and intensive early rehabilitation and often require lifelong assistance to manage routine daily activities. The VA has four polytrauma centers, located in Minneapolis, Minnesota; Palo Alto, California; Richmond, Virginia; and Tampa, Florida, that serve as regional centers for medical and rehabilitation care and hubs for research and education. The comprehensive medical and rehabilitation services provided include acute medical care, outpatient programs, and PTRPs. …
- Published
- 2011
43. THE ONCOMING STORM: STATE INDIAN CHILD WELFARE ACT LAWS AND THE CLASH OF TRIBAL, PARENTAL, AND CHILD RIGHTS.
- Author
-
Jr., Philip (Jay) McCarthy
- Subjects
ADOPTION laws, TRIBES, CIVIL rights, CHILD welfare, ADOPTIVE parents - Abstract
The article focuses on the tribal, parental, and child rights implicated by the federal Indian Child Welfare Act of 1978 (ICWA). Topics discussed include the constitutional rights of parents and children, voluntary adoptions of children from Indian tribes, and state Indian child welfare laws. It further discusses the rights of Indian tribes and of parents in adoption proceedings.
- Published
- 2013
44. Systematic study of the photodisintegration of germanium isotopes
- Author
-
Jon Jay McCarthy
- Subjects
Nuclear physics, Isotopes of germanium, Photodisintegration, Chemistry, Radiochemistry - Published
- 1973
45. Running Probabilistic Programs Backwards
- Author
-
Neil Toronto, Jay McCarthy, and David Van Horn
- Subjects
FOS: Computer and information sciences, Computer Science - Programming Languages, Programming Languages (cs.PL) - Abstract
Many probabilistic programming languages allow programs to be run under constraints in order to carry out Bayesian inference. Running programs under constraints could enable other uses such as rare event simulation and probabilistic verification---except that all such probabilistic languages are necessarily limited because they are defined or implemented in terms of an impoverished theory of probability. Measure-theoretic probability provides a more general foundation, but its generality makes finding computational content difficult. We develop a measure-theoretic semantics for a first-order probabilistic language with recursion, which interprets programs as functions that compute preimages. Preimage functions are generally uncomputable, so we derive an abstract semantics. We implement the abstract semantics and use the implementation to carry out Bayesian inference, stochastic ray tracing (a rare event simulation), and probabilistic verification of floating-point error bounds.
26 pages, ESOP 2015 (to appear)
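A toy sketch of the preimage idea behind running programs backwards, in an entirely illustrative form rather than the paper's abstract semantics: interpret a program as a function on interval boxes and over-approximate the preimage of an output constraint by recursively refining the input domain.

```python
# Over-approximate {x in domain : f(x) in target} as a list of
# intervals, splitting the domain until boxes map inside the target,
# miss it, or the refinement budget runs out. For this toy, f is
# assumed monotone on each sub-interval so its image is an interval.

def preimage(f, domain, target, depth=12):
    lo, hi = domain
    flo, fhi = f(lo), f(hi)
    out_lo, out_hi = min(flo, fhi), max(flo, fhi)
    if out_hi < target[0] or out_lo > target[1]:
        return []                       # image misses the target
    if depth == 0 or (target[0] <= out_lo and out_hi <= target[1]):
        return [(lo, hi)]               # fully inside, or budget spent
    mid = (lo + hi) / 2
    return (preimage(f, (lo, mid), target, depth - 1)
            + preimage(f, (mid, hi), target, depth - 1))

# Preimage of [0.25, 1.0] under squaring on [0, 1] is roughly [0.5, 1.0].
boxes = preimage(lambda x: x * x, (0.0, 1.0), (0.25, 1.0))
```

Conditioning on an observation then amounts to restricting the prior to such a preimage set, which is why an over-approximating, refinable preimage computation is the workhorse of the approach.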
46. Proceedings of the 30th Symposium on Implementation and Application of Functional Languages, IFL 2018, Lowell, MA, USA, September 5-7, 2018.
- Author
-
Matteo Cimini and Jay McCarthy
- Published
- 2018
47. Proceedings of the 4th and 5th International Workshop on Trends in Functional Programming in Education, TFPIE 2016, Sophia-Antipolis, France, and University of Maryland, College Park, MD, USA, June 2, 2015, and June 7, 2016.
- Author
-
Johan Jeuring and Jay McCarthy
- Published
- 2016
- Full Text
- View/download PDF
48. Trends in Functional Programming - 14th International Symposium, TFP 2013, Provo, UT, USA, May 14-16, 2013, Revised Selected Papers
- Author
-
Jay McCarthy
- Published
- 2014
- Full Text
- View/download PDF