8 results for "Melinda R. Hess"
Search Results
2. Assessing online resources for an engineering course in numerical methods
- Author
- Corina M. Owens, Autar Kaw, and Melinda R. Hess
- Subjects
- Class (computer programming), General Computer Science, Computer science, General Engineering, Assessment instrument, Multiple modes, Education, Domain (software engineering), Course (navigation), World Wide Web, Helpfulness, Mathematics education, Quality (business), Student learning
- Abstract
To determine, improve, and refine the quality of the online resources for an engineering course in numerical methods, three assessment instruments were used to gather feedback from (1) the independent instructors of the numerical methods course, (2) the students who use the majority of the resources, and (3) students worldwide who use the resources on an as-needed basis. The findings of this study provide strong evidence that the website modules are a valued aid to most students. The availability of information in multiple modes and formats, at any time, provides students with accessible and convenient learning material that enhances traditional methods. In addition, the analyses of the open-ended items by both faculty reviewers and students provided insights into how a website used in a technical course such as numerical methods can be effectively organized and implemented to further enhance student learning. Instructor surveys gave the highest ratings to the perceived helpfulness of the modules in supplementing student readings and class presentations, while student surveys gave the highest ratings in the technological domain. © 2010 Wiley Periodicals, Inc. Comput Appl Eng Educ 20: 426–433, 2012
- Published
- 2010
3. Multilevel Modeling: A Review of Methodological Issues and Applications
- Author
- John M. Ferron, Jeffrey D. Kromrey, Kristine Y. Hogarty, Thomas R. Lang, John D. Niles, Robert F. Dedrick, Reginald S. Lee, and Melinda R. Hess
- Subjects
- Educational research, Content analysis, Management science, Computer science, Multilevel model, Inference, Covariance, Scientific communication, Checklist, Education, Coding (social sciences)
- Abstract
This study analyzed the reporting of multilevel modeling applications in a sample of 99 articles from 13 peer-reviewed journals in education and the social sciences. A checklist, derived from the methodological literature on multilevel modeling and focusing on the issues of model development and specification, data considerations, estimation, and inference, was used to analyze the articles. The most common applications were two-level models in which individuals were nested within contexts. Most studies were non-experimental and used nonprobability samples. The amount of data at each level varied widely across studies, as did the number of models examined. Analyses of reporting practices indicated some clear problems, with many articles not reporting enough information for a reader to critique the reported analyses. For example, in many articles, one could not determine how many models were estimated, what covariance structure was assumed, what type of centering, if any, was used, whether the data were consistent with assumptions, whether outliers were present, or how the models were estimated. Guidelines for researchers reporting multilevel analyses are provided.
- Published
- 2009
4. Effectiveness of Interactive Online Algebra Learning Tools
- Author
- Kathy Jo Gillan, Cathy Cavanaugh, Jan Bosnick, Heather Scott, and Melinda R. Hess
- Subjects
- Virtual school, Multimedia, Computer science, Teaching method, Computer Science Applications, Education, Algebra, Teacher preparation, Component (UML), Online course, Mathematics education, Student learning, Mathematics instruction
- Abstract
This study of student performance in an online Algebra course examined the development, implementation, and evaluation of interactive tools for graphing linear equations. The study focused on an interactive tool that was evaluated with virtual school Algebra students on a challenging component of the course. The performance of these students on the component was compared with the performance of students who did not use the intervention. The performance of students learning in the online course with the interactive tools was equivalent to that of students who did not use the tools. The implications of the unique nature of the online Algebra course for teacher preparation are discussed.
- Published
- 2008
5. Estimation in SEM: A Concrete Example
- Author
- Melinda R. Hess and John M. Ferron
- Subjects
- Estimation, Estimation theory, Computation, Maximum likelihood, Function (mathematics), Structural equation modeling, Education, Statistics, Applied mathematics, Statistical analysis, Newton's method, Social Sciences (miscellaneous), Mathematics
- Abstract
A concrete example is used to illustrate maximum likelihood estimation of a structural equation model with two unknown parameters. The fitting function is found for the example, as are the vector of first-order partial derivatives, the matrix of second-order partial derivatives, and the estimates obtained from each iteration of the Newton-Raphson algorithm. The goal is to provide a concrete illustration to help those learning structural equation modeling bridge the gap between the verbal descriptions of estimation procedures and the mathematical definition of these procedures provided in the technical literature.
- Published
- 2007
6. Interval Estimates of Multivariate Effect Sizes
- Author
- Kristine Y. Hogarty, John M. Ferron, Jeffrey D. Kromrey, and Melinda R. Hess
- Subjects
- Percentile, Mahalanobis distance, Multivariate statistics, Applied Mathematics, Population, Confidence interval, Education, Bootstrapping (statistics), Sample size determination, Statistics, Developmental and Educational Psychology, Credible interval, Econometrics, Applied Psychology, Mathematics
- Abstract
Monte Carlo methods were used to examine techniques for constructing confidence intervals around multivariate effect sizes. Using interval inversion and bootstrapping methods, confidence intervals were constructed around the standard estimate of Mahalanobis distance (D²), two bias-adjusted estimates of D², and Huberty’s I. Interval coverage and width were examined across conditions by adjusting sample size, number of variables, population effect size, population distribution shape, and the covariance structure. The accuracy and precision of the intervals varied considerably across methods and conditions; however, the interval inversion approach appears to be promising for D², whereas the percentile bootstrap approach is recommended for the other effect size measures. The results imply that it is possible to obtain fairly accurate coverage estimates for multivariate effect sizes. However, interval width estimates tended to be large and uninformative, suggesting that future efforts might focus on investigating design factors that facilitate more precise estimates of multivariate effect sizes.
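The percentile bootstrap approach recommended in this abstract can be sketched as follows. This is a generic illustration with made-up two-group data, not the paper's simulation design: cases are resampled within each group, the sample Mahalanobis D² is recomputed on each resample, and the empirical α/2 and 1 − α/2 percentiles of the bootstrap distribution serve as the interval endpoints.

```python
import numpy as np

def mahalanobis_d2(a, b):
    """Sample Mahalanobis distance between two group mean vectors,
    using the pooled within-group covariance matrix."""
    diff = a.mean(axis=0) - b.mean(axis=0)
    na, nb = len(a), len(b)
    pooled = ((na - 1) * np.cov(a, rowvar=False)
              + (nb - 1) * np.cov(b, rowvar=False)) / (na + nb - 2)
    return float(diff @ np.linalg.solve(pooled, diff))

def percentile_bootstrap_ci(a, b, n_boot=2000, alpha=0.05, seed=0):
    """Resample cases within each group, recompute D^2, and take
    the empirical alpha/2 and 1 - alpha/2 percentiles."""
    rng = np.random.default_rng(seed)
    boots = np.empty(n_boot)
    for i in range(n_boot):
        a_star = a[rng.integers(0, len(a), len(a))]
        b_star = b[rng.integers(0, len(b), len(b))]
        boots[i] = mahalanobis_d2(a_star, b_star)
    return np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])

rng = np.random.default_rng(42)
group1 = rng.normal(0.0, 1.0, size=(60, 3))
group2 = rng.normal(0.8, 1.0, size=(60, 3))  # shifted on every variable
lo, hi = percentile_bootstrap_ci(group1, group2)
```

As the abstract notes, such intervals can be wide: even with a moderate true effect, the upper and lower percentile endpoints may span a substantial range of effect sizes.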
- Published
- 2007
7. Making treatment effect inferences from multiple-baseline data: the utility of multilevel modeling approaches
- Author
- Melinda R. Hess, John M. Ferron, Susan T. Hibbard, Bethany A. Bell, and Gianna Rendina-Gobioff
- Subjects
- Statistical models, Average treatment effect, Autocorrelation, Multilevel model, Degrees of freedom (statistics), Experimental and Cognitive Psychology, Variance, Multiple baseline design, Arts and Humanities (miscellaneous), Autoregressive model, Reference values, Statistical data interpretation, Statistics, Developmental and Educational Psychology, Econometrics, Humans, Psychology (miscellaneous), Point estimation, Monte Carlo method, General Psychology, Algorithms, Psychomotor performance, Mathematics, Behavioral research
- Abstract
Multiple-baseline studies are prevalent in behavioral research, but questions remain about how to best analyze the resulting data. Monte Carlo methods were used to examine the utility of multilevel models for multiple-baseline data under conditions that varied in the number of participants, number of repeated observations per participant, variance in baseline levels, variance in treatment effects, and amount of autocorrelation in the Level 1 errors. Interval estimates of the average treatment effect were examined for two specifications of the Level 1 error structure (σ²I and first-order autoregressive) and for five different methods of estimating the degrees of freedom (containment, residual, between-within, Satterthwaite, and Kenward-Roger). When the Satterthwaite or Kenward-Roger method was used and an autoregressive Level 1 error structure was specified, the interval estimates of the average treatment effect were relatively accurate. Conversely, the interval estimates of the treatment effect variance were inaccurate, and the corresponding point estimates were biased.
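The kind of data this abstract describes can be made concrete with a small simulation sketch. The parameter values below (number of cases, staggered intervention points, effect sizes, autocorrelation) are illustrative choices, not the conditions studied in the paper: each case gets its own intervention point, a random baseline level, a random treatment effect, and first-order autoregressive Level 1 errors.

```python
import numpy as np

def simulate_multiple_baseline(n_cases=4, n_obs=20, effect_mean=1.0,
                               effect_sd=0.3, baseline_sd=0.5,
                               rho=0.3, seed=0):
    """Generate multiple-baseline data: each case has a staggered
    intervention point, a random baseline level, a random treatment
    effect, and AR(1) Level-1 errors with autocorrelation rho."""
    rng = np.random.default_rng(seed)
    data = []
    for case in range(n_cases):
        start = 5 + 3 * case                         # staggered start
        level = rng.normal(0, baseline_sd)           # baseline level
        effect = rng.normal(effect_mean, effect_sd)  # treatment effect
        e = np.empty(n_obs)
        # Draw e[0] from the stationary AR(1) distribution, then
        # propagate: e[t] = rho * e[t-1] + white noise.
        e[0] = rng.normal(0, 1 / np.sqrt(1 - rho ** 2))
        for t in range(1, n_obs):
            e[t] = rho * e[t - 1] + rng.normal()
        phase = (np.arange(n_obs) >= start).astype(float)  # 0 = baseline
        y = level + effect * phase + e
        data.append((case, phase, y))
    return data

data = simulate_multiple_baseline()
```

A two-level model for such data would treat the repeated observations (Level 1) as nested within cases (Level 2), with the baseline level and the treatment effect allowed to vary randomly across cases, which is the structure whose interval estimates the study evaluates.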
- Published
- 2009
8. A Comprehensive System For The Evaluation Of Innovative Online Instruction At A Research University: Foundations, Components, And Effectiveness
- Author
- Thomas R. Lang, Kristine Y. Hogarty, Melinda R. Hess, Jeffrey D. Kromrey, Amy Hilbelink, and Gianna Rendina-Gobioff
- Subjects
- Formative assessment, Engineering management, Evaluation system, Computer science, Online instruction, Management science, Coursework, The Internet, Course development, Course (navigation)
- Abstract
The delivery of post-secondary coursework via the Internet continues to gain momentum. As a result, investigations into appropriate methods of evaluating the effectiveness of these courses are required. In an effort to meet this challenge, this study describes the development and implementation of an evaluation system applied to new online programs at a major research university. A systematic approach to evaluation provided formative feedback on the processes and products of course development, using diverse data sources including course documents, interviews, and web-based surveys. Results of both quantitative and qualitative analyses support the integrity of the evaluation system and provide preliminary indications of course effectiveness based on student satisfaction.
- Published
- 2005