ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


Filter
  • Collection: Articles  (503)
  • Publisher: Oxford University Press; American Physical Society; Molecular Diversity Preservation International
  • Journal: Law, Probability and Risk  (119)
  • Topic: Mathematics  (503); Law  (503)
  • 1
    Publication Date: 2013-09-12
    Description: The tension between the meaning of causality in science and law or public policy is well-known; however, defendants in product liability cases or industries that might be affected by a government regulation may try to convince the factfinder to require evidence of a causal relationship that meets the standards of science. From the perspective of public health, however, people may be exposed unnecessarily to a health risk during the time period between the establishment of reasonably strong evidence of a causal relationship and the overwhelming evidence required for scientific causality. The Bayesian paradigm enables one to update information from epidemiologic studies as they accumulate, providing estimates of the probability that the relative risk of a particular harm from exposure exceeds a threshold value, e.g. 2.0 or 4.0 that is sufficient to meet the preponderance of the evidence standard or to support a health initiative. In order to diminish the role of the initial prior distribution, which may be quite subjective, the first case-control study or an analysis of adverse event and case reports is used to determine two prior distributions. One is the most favourable to the defendant, or industry that might be regulated, which is consistent with the previous data. The other is centred on or near the estimated relative risk from the first study. The method is applied to the studies that linked aspirin use to Reye syndrome and demonstrates that the evidence of a causal association was sufficiently strong in 1982, when the Food and Drug Administration first proposed that the public be warned of the risk, to support the regulation. Thus, lives would have been saved had the warning been given at the end of 1982 rather than in early 1985.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
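The sequential Bayesian updating this abstract describes can be sketched with a conjugate normal model on the log relative risk; the sceptical prior, the study estimates and the 2.0 threshold below are illustrative assumptions, not the paper's data.

```python
import math

def normal_update(prior_mean, prior_var, est, est_var):
    """Conjugate normal-normal update on the log relative risk."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / est_var)
    post_mean = post_var * (prior_mean / prior_var + est / est_var)
    return post_mean, post_var

def prob_rr_exceeds(mean, var, threshold):
    """P(RR > threshold) when log(RR) is normal with given mean/var."""
    z = (math.log(threshold) - mean) / math.sqrt(var)
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

# Sceptical prior centred on RR = 1, then three hypothetical studies,
# each reported as (log RR estimate, variance of that estimate).
mean, var = 0.0, 1.0
for est, est_var in [(1.2, 0.25), (1.5, 0.16), (1.4, 0.09)]:
    mean, var = normal_update(mean, var, est, est_var)

print(round(prob_rr_exceeds(mean, var, 2.0), 3))
```

As studies accumulate, the posterior probability that the relative risk exceeds 2.0 rises; this is the quantity the article compares against the preponderance-of-the-evidence standard.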
  • 2
    Publication Date: 2013-09-12
    Description: In law, inferences of causation are sometimes made through a structured process in which multiple participants play various roles, and make decisions concerning various logical components of the overall inference (such as legal rules, policy objectives, presumptions, evidence, burdens of proof and findings of fact). This article illustrates such a process using empirical research into compensation decisions in the USA for injuries allegedly caused by vaccinations. Empirical research into actual legal processes is essential, in order to discover how various players approach their sub-tasks of decision-making. It also provides insights for areas outside of law, such as non-monotonic logic, cognitive science, sociology and artificial intelligence.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 3
    Publication Date: 2013-09-12
    Description: Situations of causal factual uncertainty are relatively common in law. The problems and difficulties regarding ‘factual causation’ in law point to the need for ‘evidence’ and ‘proof’ models that are adequate and capable of accommodating the tests and methodologies used to explain and demonstrate it in a legal context. Given the configuration of the situations of causal factual uncertainty and the available ‘evidence’ and ‘proof’ models, I argue that it is justified to use an ‘argumentative-narrative’ model for ‘proving causation’ in law. However, considering that each model of ‘evidence’ and ‘proof’ reveals a different kind of ‘rationality’ that can still be viewed in different ways, I also argue that we must try to match the perspective we have on the ‘rationality’ behind the chosen model of ‘evidence’ and ‘proof’ with the ‘rationality’ underlying ‘causation’ in law.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 4
    Publication Date: 2013-09-12
    Description: At least in some cases, the values confronted in legal decision-making appear to be incommensurable. Some legal theorists resist incommensurability because they fear that this presents an overwhelming obstacle to rational decision-making. By offering a close analysis of proportionality and, more particularly, measures of proportional value satisfaction, I show that this fear is unfounded. Comparative measures of proportional value satisfaction do not require the values to be commensurable. However, assuming incommensurability presents us with the problem of public significance in the proportional satisfaction of values. When two values are commensurable, this public significance is provided by the mediating effects of the overarching third value that provides the common measure of the values. However, when this common measure is removed, then the public significance of value satisfaction must be otherwise achieved. This is why I propose an equal proportional value satisfaction as the most appropriate proportionality maximand. Under equal proportional value satisfaction, the proportional satisfaction of any one value has significance for each and every other value. This kind of public significance is interpersonal rather than impersonal (or second-personal rather than third-personal). The article then shows that the legal process that is most appropriate to equal proportionality is a process that implements defeasible legal rules.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 5
    Publication Date: 2013-09-12
    Description: In order to allocate the risk between parties in legal adjudication, we use evidentiary techniques, the main device among them being the standard of proof (SoP). The traditional view holds the grade of probability to be the parameter that shifts when moving between standards. However, as soon as we dig slightly deeper, an incoherent picture is revealed. In this article, I challenge the accepted view and try to show that it faces insurmountable problems concerning the rationality, the grammatical consistency and the impact of the SoP on the acceptability of verdicts. At the end of the article, I briefly discuss the theory of epistemological contextualism and propose a framework that allows rational distinctions to be drawn between different standards of proof. In the second part of this project (forthcoming), I will defend a contextualist view according to which the shifting parameter is not the grade of (aleatory) probability, but instead the Set of Epistemic Defeaters in play.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 6
    Publication Date: 2013-09-12
    Description: This article focuses on the question of how decision makers with no relevant scientific background can (if at all) legitimately evaluate conflicting scientific expert testimonies and determine their relative reliability. Sceptics argue that non-experts can never reach justifiable conclusions regarding the merits of conflicting expert testimonies because they lack the fundamental epistemic capacity to make such judgement calls. In this article, I draw on works on epistemology, philosophy of practical reasoning, philosophy of science, science and technology studies, and legal theory in order to scrutinize recent proposals to solve the problem of conflicting scientific expert testimonies. Addressing this question is of utmost importance because immanent in the idea of the rule of law there is an intellectual due process norm, which articulates that epistemically arbitrary legal decisions are also not legally justified. This article is divided into two sections. In Section 2, I describe the basic philosophical inquiries underlying the debate about expert testimony. In particular, I first elaborate on the philosophy of testimony and its epistemic justifications, then move to the idea of epistemic deference, and finish with philosophical accounts of expertise. Section 3 presents the problem of conflicting scientific expert testimonies and analyses recent attempts to solve it as formulated by Ward Jones, Alvin Goldman and Scott Brewer. I argue that there is no single criterion (or set of criteria) upon which the non-expert could rely in order to make a rationally justified decision in each and every case in which he faces conflicting scientific expert testimonies. The alternative view here defended is to stop looking for an epistemic panacea and accept the idea that testimonial reliability operates differently within different kinds of testimony—and differently within the same kind of testimony at different times.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 7
    Publication Date: 2013-06-08
    Description: The criterion of bioequivalence of two drugs in infringement cases may differ from the requirements used for drug approval by the FDA. In Adams v. Perrigo, the Federal Circuit examined three different sets of criteria for judging bioequivalence. The statistical properties of those criteria are explored and evaluated. Our results support the appellate court’s decision to impose less stringent requirements for bioequivalence in infringement cases.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
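The record does not reproduce the three sets of criteria the article compares; as a hedged illustration only, one standard criterion — the FDA-style "80/125" average-bioequivalence check on the log scale — can be sketched as follows, with all numbers hypothetical.

```python
import math
from statistics import NormalDist

def average_bioequivalence(log_ratio_mean, se,
                           lo=math.log(0.80), hi=math.log(1.25)):
    """Declare bioequivalence when the 90% confidence interval for the
    log ratio of mean drug exposures lies within [log 0.80, log 1.25]."""
    z = NormalDist().inv_cdf(0.95)  # equivalent to two one-sided 5% tests
    return lo <= log_ratio_mean - z * se and log_ratio_mean + z * se <= hi

print(average_bioequivalence(0.05, 0.05))  # tight estimate near parity
print(average_bioequivalence(0.20, 0.10))  # interval crosses the bound
```

Tightening or loosening the interval bounds is one way a court could impose more or less stringent requirements than the regulatory standard.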
  • 8
    Publication Date: 2013-06-08
    Description: The North Carolina Racial Justice Act allows defendants to submit statistical studies of prosecutorial actions pertaining to their seeking the death penalty or in making peremptory challenges. These studies may consider data from four geographical regions: the state, county, judicial division or prosecutorial district. A study of the effect of race on peremptory challenges in death penalty cases demonstrating statistically significant disparities disadvantaging Black defendants has been submitted in several cases. This comment shows that a more appropriate statistical analysis yields much stronger statistical evidence that race entered into the peremptory challenge process in Randolph County than the affidavit submitted by the authors of the study. A subsequent sensitivity analysis indicates that in order for a characteristic to explain the highly statistically significant disparity, it would need to increase the odds of an individual being challenged by a factor of three and more than twice as many Black venire members would need to possess that characteristic as non-Blacks. Since the data examined excluded potential jurors who had been removed for cause, it may be difficult for the state to find a legitimate reason justifying the racial disparity.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
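The kind of disparity analysis the comment describes can be illustrated with a one-sided Fisher exact test on a 2×2 strike table; the counts below are hypothetical, not the Randolph County data.

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for a 2x2 table [[a, b], [c, d]]
    (rows: Black / other venire members; columns: struck / not struck),
    summing hypergeometric probabilities at least as extreme as a."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    def pmf(x):
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    return sum(pmf(x) for x in range(a, min(row1, col1) + 1))

# Hypothetical venire: 20 of 30 Black members struck, 25 of 120 others
p = fisher_exact_one_sided(20, 10, 25, 95)
odds_ratio = (20 * 95) / (10 * 25)
print(round(odds_ratio, 1), p < 0.001)
```

The sensitivity analysis in the comment then asks how strong an unobserved characteristic (here, an odds multiplier of three) would have to be, and how unevenly distributed, to explain away such a disparity.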
  • 9
    Publication Date: 2013-06-08
    Description: This article critically evaluates experiments used to justify inferences of specific source attribution (‘individualization’) to ‘100% certainty’ and ‘near-zero’ rates of error claimed by firearm toolmark examiners in court testimonies, and suggests approaches for establishing statistical foundations for firearm toolmarks practice that two recent National Academy of Science reports confirm do not currently exist. Issues that should be considered in the earliest stages of statistical foundational development for firearm toolmarks are discussed.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 10
    Publication Date: 2013-06-08
    Description: Much debate exists between Frequentist and Bayesian methods in statistics. In the evaluation of evidence, the likelihood ratio is credited with quantifying the value of evidence in favour of one or other proposition by considering the probability of the evidence conditional on each proposition, and this then converts the Bayesian prior odds into the posterior odds. Motivated by this approach, this paper considers an alternative p-value-based likelihood ratio by explicitly taking into account the behaviour of the Frequentist p-value under both hypotheses, rather than restricting focus solely on the null hypothesis. It is shown that by accommodating the alternative hypothesis, analysis leads to inferential conclusions which are consistent with Bayesian methods.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
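The p-value-based likelihood ratio this abstract motivates can be sketched for a one-sided z-test: under the null the p-value is uniform, and under an alternative that shifts the test statistic by delta its density follows by a change of variables. The shift values below are illustrative, not taken from the paper.

```python
from statistics import NormalDist

N = NormalDist()

def p_value_likelihood_ratio(p, delta):
    """Density of the one-sided p-value under H1 (test statistic shifted
    by delta) divided by its density under H0 (uniform on [0, 1])."""
    z = N.inv_cdf(1.0 - p)              # test statistic implied by p
    return N.pdf(z - delta) / N.pdf(z)  # change-of-variables ratio

print(p_value_likelihood_ratio(0.05, 2.0) > 1.0)  # small p favours H1
print(p_value_likelihood_ratio(0.50, 2.0) < 1.0)  # large p favours H0
```

With delta = 0 the ratio is identically 1, which recovers the intuition that a p-value carries no evidential weight unless the alternative is specified.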
  • 11
    Publication Date: 2013-06-08
    Description: This essay explores the implications of complexity for understanding both the law of evidence and the nature of the legal system. Among the propositions critically analysed is that one significant way to understand the general problem of the meaning of rationality is as a multivariate search for tools to understand and regulate a hostile environment. The law of evidence is conceptualized as a subset of this effort, at least in part, as involving a search for tools to regulate the almost infinitely complex domain of potentially relevant evidence and at the same time to accommodate policy demands. The proposition is then considered that the legal system of which the evidentiary system is a part has emergent properties that may not be deducible from its component parts, which suggests in turn that it may be, or at least has properties highly analogous to, a complex adaptive system. One implication of this analysis is that the tools of standard academic research that rely heavily on the isolation and reduction of analytical problems to manageable units to permit them to be subjected to standard deductive methodologies may need to be supplemented with analytical tools that facilitate the regulation of complex natural phenomena such as fluid dynamics. This has direct implications for such things as the conception of law as rules, and thus for the Hart–Dworkin debate that has dominated jurisprudence for 50 years. That debate may have mischaracterized the object of its inquiry, and thus the Dworkinian solution to the difficulties of positivism is inapplicable. It can certainly be shown that the Dworkinian solution is not achievable and cannot rationally be approximated. Solutions to legal problems within the legal system as a whole (as compared to any particular node within the legal system) are arrived at through a process of inference to the best explanation that occurs within a highly interconnected set of nodes similar to a neural or social network.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 12
    Publication Date: 2013-09-12
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 13
    Publication Date: 2013-09-12
    Description: Causation as an element of a criminal offence is different from the probative difficulties. The empirical laws that are relevant to the proof of causation, as a pure matter of fact, are not discussed here, but only causality as a category of our understanding and a general law of the intelligible world. This general law of causality is equally valid for all result crimes (e.g. homicide, bodily harm, deception offences and criminal damage). According to the European continental theory of conditions, any ‘conditio sine qua non’ is by itself a cause. Causation is established by the formula of ‘conditio’ (similar to the so-called ‘but for’ test in the common law), which corresponds to a counterfactual reasoning. However, that formula is not able to resolve adequately those cases of causal overdetermination where the result occurred by means of actions of multiple, independently intervening agents. A semantic model of the world evolution, based upon ramified temporal logic, may assist the comprehension of causal connections between human actions and the relevant results. At the end of the day, this model allows us to understand that even in situations where no kind of factual uncertainty is present, doubts upon the attribution of causation to specific agents remain. We shall conclude that the attribution of causation is not a natural problem, but a logico-legal one, that has to be dealt with by way of logico-legal criteria. Nevertheless, attribution of causation must be clearly distinguished from objective imputation of proscribed harm.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 14
    Publication Date: 2016-06-04
    Description: The current practice of Malaysian courts in calculating the award of damages for loss of future earnings in personal injury and fatal accident claims is the conventional multiplier–multiplicand approach, without admitting any actuarial evidence. The objective is to calculate an appropriate amount of compensation that will restore the plaintiff to the position he would have been in had the damage not occurred. This article attempts to develop actuarial models, using the concept of human life value, that can be used as a guide to determine the amount of loss of future earnings. We believe this is where actuarial scientists need to play a role in developing a new scientific model in order to arrive at an appropriate award that is relevant and satisfies both plaintiff and defendant.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 15
    Publication Date: 2016-06-04
    Description: Recently published articles have proposed the use of likelihood ratios (LRs) in determining the evidential value of finding a given number of gunshot residue (GSR) particles on a suspect. LRs depend on the probabilistic models assumed for the defence proposition (the suspect was not involved in a shooting) and the prosecutor’s proposition (the suspect was involved), and should be calculated based on data obtained in well-designed experiments. However, the statistical aspects of the analysis that select the appropriate model and provide uncertainty measures are rarely considered. In this article, data from Cardinetti et al. (2006, A proposal for statistical evaluation of the detection of gunshot residues on a suspect. Scanning, 28(3):142–147) are used to demonstrate the sensitivity of calculated LRs to the assumed model. It is shown that the Poisson model, considered by Cardinetti and others, is inappropriate and that a Negative Binomial model fits the data much better. The statistical error arising from the fact that models are estimated from small samples is discussed, as well as the importance of accounting for this error. We conclude that only with a large database can statistical models be estimated accurately and LRs be treated as valid scientific measures.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
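The model-sensitivity point can be illustrated by comparing Poisson and method-of-moments Negative Binomial log-likelihoods on overdispersed counts; the data below are synthetic, not the Cardinetti et al. measurements.

```python
import math

def poisson_loglik(counts, lam):
    """Log-likelihood of counts under a Poisson(lam) model."""
    return sum(k * math.log(lam) - lam - math.lgamma(k + 1) for k in counts)

def negbin_loglik(counts, r, p):
    # NB(r, p): pmf(k) = Gamma(k+r) / (Gamma(r) k!) * p^r * (1-p)^k
    return sum(math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
               + r * math.log(p) + k * math.log(1.0 - p) for k in counts)

# Synthetic, overdispersed particle counts (sample variance >> mean)
counts = [0, 0, 0, 1, 1, 2, 3, 5, 8, 12]
mean = sum(counts) / len(counts)
var = sum((k - mean) ** 2 for k in counts) / (len(counts) - 1)

# Method of moments for the Negative Binomial: var = mean + mean^2 / r
r = mean ** 2 / (var - mean)
p = r / (r + mean)

print(negbin_loglik(counts, r, p) > poisson_loglik(counts, mean))
```

Because the Poisson model forces the variance to equal the mean, it underfits overdispersed GSR counts, and any LR built on it inherits that model error.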
  • 16
    Publication Date: 2016-06-04
    Description: We first discuss certain problems with the classical probabilistic approach for assessing forensic evidence, in particular its inability to distinguish between lack of belief and disbelief, and its inability to model complete ignorance within a given population. We then discuss Shafer belief functions, a generalization of probability distributions, which can deal with both these objections. We use a calculus of belief functions which does not use the much criticized Dempster rule of combination, but only the very natural Dempster–Shafer conditioning. We then apply this calculus to some classical forensic problems like the various island problems and the problem of parental identification. If we impose no prior knowledge apart from assuming that the culprit or parent belongs to a given population (something which is possible in our setting), then our answers differ from the classical ones when uniform or other priors are imposed. We can actually retrieve the classical answers by imposing the relevant priors, so our set-up can and should be interpreted as a generalization of the classical methodology, allowing more flexibility. We show how our calculus can be used to develop an analogue of Bayes’ rule, with belief functions instead of classical probabilities. We also discuss consequences of our theory for legal practice.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 17
    Publication Date: 2015-06-03
    Description: A definition of causality introduced by Halpern & Pearl, which uses structural equations, is reviewed. A more refined definition is then considered, which takes into account issues of normality and typicality, which are well known to affect causal ascriptions. Causality is typically an all-or-nothing notion: either A is a cause of B or it is not. An extension of the definition of causality to capture notions of degree of responsibility and degree of blame, due to Chockler and Halpern, is reviewed. For example, if someone wins an election 11–0, then each person who votes for him is less responsible for the victory than if he had won 6–5. Degree of blame takes into account an agent's epistemic state. Roughly speaking, the degree of blame of A for B is the expected degree of responsibility of A for B, taken over the epistemic state of an agent. Finally, the structural-equations definition of causality is compared to Wright’s NESS test.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
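The voting example can be made concrete with the Chockler–Halpern measure 1/(k+1), where k is the minimal number of other votes that must change before the voter in question becomes pivotal; the sketch below assumes a simple majority vote that the motion won.

```python
def degree_of_responsibility(votes_for, votes_against):
    """Responsibility of a single 'for' voter in a passed majority vote:
    1/(k+1), where k other 'for' votes must switch to reach a bare
    majority in which this voter is pivotal (Chockler-Halpern)."""
    majority = (votes_for + votes_against) // 2 + 1
    k = votes_for - majority  # switches needed to make this voter pivotal
    return 1.0 / (k + 1)

print(degree_of_responsibility(6, 5))   # already pivotal -> 1.0
print(degree_of_responsibility(11, 0))  # five switches needed -> 1/6
```

This reproduces the abstract's example: each voter in an 11–0 victory bears responsibility 1/6, against full responsibility 1 in a 6–5 victory.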
  • 18
    Publication Date: 2015-06-03
    Description: The European Court of Justice has held that as from 21 December 2012, insurers may no longer charge men and women differently on the basis of scientific evidence that is statistically linked to their sex, effectively prohibiting the use of sex as a factor in the calculation of premiums and benefits for the purposes of insurance and related financial services throughout the European Union. This ruling marks a sharp turn away from the traditional view that insurers should be allowed to apply just about any risk assessment criterion, so long as it is sustained by the findings of actuarial science. The naïveté behind the assumption that insurers’ recourse to statistical data and probabilistic analysis, given their scientific nature, would suffice to keep them out of harm’s way was exposed. In this article, I look at the flaws of this assumption and question whether this judicial decision, whilst constituting a most welcome landmark in the pursuit of equality between men and women, has nonetheless gone too far by saying too little on the million dollar question of what separates admissible criteria of differentiation from inadmissible forms of discrimination.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 19
    Publication Date: 2015-06-03
    Description: A study by the authors on determining the manner of death used a method novel to forensic pathology. This article details the method used. Drawing on the methodology of evidence-based medicine, data were systematically identified and pooled to provide a robust and substantial dataset of probabilities for forensic evidential features of gunshot wounds. This provided source data for a Bayesian analysis to determine the probable manner of death. We suggest the same method can be applied to a wide variety of evidence, meeting the need for strong and reliable data highlighted by R v. T and subsequent debate.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 20
    Publication Date: 2015-06-03
    Description: The Engineering Department of the Spanish Civil Guard has been using automatic speaker recognition systems for forensic purposes since 2004, providing likelihood ratios. These ratios are quantitatively much more modest than in the DNA field. In this context, a suitable calculation of the prior odds is essential in order to obtain the posterior odds once the comparison result is expressed as a likelihood ratio. These odds are the responsibility of a Judge, and many consider it unlikely that they can be quantitatively calculated in real cases. However, our experience defending over 500 speaker recognition expert reports in Court allows us to suggest how the expert may support Judges, from a technical point of view, in assessing the odds. Such technical support should preferentially be provided in the preliminary investigation stage, after the expert report is issued by the laboratory, as in the course of oral hearings it is much more difficult for those who are not familiar with the new paradigm. It can be initiated upon request by the Examining Judge or any of the litigant parties. We consider this practice favourable to the equality of arms principle. The use of Bayesian networks is proposed to provide inferential assistance to the Judge when assessing the prior odds. An example of the above is provided by the case of the terrorist attack against Madrid-Barajas Airport Terminal 4 perpetrated in December 2006.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 21
    Publication Date: 2015-06-03
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 22
    Publication Date: 2012-03-09
    Description: The role played by the required level of statistical significance in the scientific method should not be seen as analogous to that played by the standard of proof in the legal process.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 23
    Publication Date: 2012-03-09
    Description: The Supreme Court ruled in Matrixx that statistical significance is not necessary to show that a drug caused an adverse reaction. Five circuit court decisions holding otherwise preceded this decision. This paper examines the extent to which the Supreme Court’s reasoning differed from those of the circuit courts.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 24
    Publication Date: 2012-03-09
    Description: Unlike the evaluation of single items of scientific evidence, the formal study and analysis of the joint evaluation of several distinct items of forensic evidence has to date received only sporadic, rather than systematic, attention. Questions about (i) the relationships among a set of (usually unobservable) propositions and a set of (observable) items of scientific evidence, (ii) the joint probative value of a collection of distinct items of evidence and (iii) the contribution of each individual item within a given group of pieces of evidence still represent fundamental areas of research. To some degree, this is remarkable, since both forensic science theory and practice, as well as many daily inference tasks, require the consideration of multiple items if not masses of evidence. A recurrent complication that arises in such settings is that the application of probability theory, i.e. the reference method for reasoning under uncertainty, becomes increasingly demanding. The present paper takes this as a starting point and discusses graphical probability models, i.e. Bayesian networks, as a framework within which the joint evaluation of scientific evidence can be approached in a viable way. Based on a review of the main existing contributions in this area, the article presents instances of real case studies from the author’s institution in order to point out the usefulness and capacities of Bayesian networks for the probabilistic assessment of the probative value of multiple and interrelated items of evidence. A main emphasis is placed on underlying general patterns of inference, their representation and their graphical probabilistic analysis. Attention is also drawn to inferential interactions, such as redundancy, synergy and directional change, which distinguish the joint evaluation of evidence from assessments of isolated items of evidence. Together, these topics are of interest to both domain experts and recipients of expert information, because they have bearing on how multiple items of evidence are meaningfully and appropriately set into context.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
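As a baseline for the joint evaluation the paper studies, conditionally independent items of evidence combine by multiplying their likelihood ratios into the prior odds; the numbers below are illustrative. The paper's interest lies precisely in the interactions (redundancy, synergy, directional change) under which this naive product no longer holds and a Bayesian network is needed.

```python
def posterior_odds(prior_odds, likelihood_ratios):
    """Combine conditionally independent items of evidence by
    multiplying their likelihood ratios into the prior odds."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

# Illustrative: prior odds 1:1000, two independent items with LR 100 and 50
print(round(posterior_odds(1 / 1000, [100, 50]), 6))
```

When two items are partly redundant (e.g. two reports tracing back to one source), their joint LR is less than the product, which is exactly the kind of dependence a network representation makes explicit.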
  • 25
    Publication Date: 2012-03-09
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 26
    Publication Date: 2012-03-09
    Description: In its first foray into the labyrinth that causation in personal injury has become, the U.K. Supreme Court recently held obiter that statistical evidence alone could not establish causation. But in an earlier toxic tort case, the High Court had relied on epidemiological evidence to identify a cluster of birth defects arising in the vicinity of a contaminated land site. This recent British experience is then discussed within the wider context of the forensic role of ‘naked statistics’.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics, Law
  • 27
    Publication Date: 2012-03-09
    Description: Scales of conclusion in forensic interpretation play an important role in the interface between scientific work at a forensic laboratory and different bodies of the jurisdictional system of a country. Of particular importance is the use of a unified scale that allows interpretation of different kinds of evidence in one common framework. The logical approach to forensic interpretation comprises the use of the likelihood ratio as a measure of evidentiary strength. While fully understood by forensic scientists, the likelihood ratio may be hard to interpret for a person not trained in natural sciences or mathematics. Translation of likelihood ratios to an ordinal scale including verbal counterparts of the levels is therefore a necessary procedure for communicating evidence values to the police and in the courtroom. In this paper, we present a method to develop an ordinal scale for the value of evidence that can be applied to any type of forensic findings. The method is built on probabilistic reasoning about the interpretation of findings and the number of scale levels chosen is a compromise between a pragmatic limit and mathematically well-defined distances between levels. The application of the unified scale is illustrated by a number of case studies.
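    The mapping from numeric likelihood ratios to an ordinal verbal scale that the abstract describes can be sketched as follows. The cut-offs (powers of ten) and the verbal labels below are illustrative assumptions, not the scale developed in the article:

```python
# Hypothetical scale: lower bounds of likelihood ratio (LR) bands and
# their verbal labels. Both are assumptions for illustration only.
SCALE = [
    (1e0, "weak support"),
    (1e1, "limited support"),
    (1e2, "moderate support"),
    (1e3, "strong support"),
    (1e4, "very strong support"),
]

def verbal_level(lr):
    """Return the label of the highest band whose lower bound the LR
    reaches; LRs below 1 fall outside the one-sided scale shown here."""
    label = "inconclusive"
    for bound, name in SCALE:
        if lr >= bound:
            label = name
    return label

print(verbal_level(250.0))  # an LR of 250 lands in the 100..1000 band
```

    A real scale would also treat LRs below 1 (support for the alternative proposition) symmetrically, which is part of what makes the choice of levels a pragmatic compromise.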
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 28
    Publication Date: 2012-03-09
    Description: In personal injury litigation, claimants may seek their compensation for future losses or expenses as a lump sum that is determined by the product of a multiplicand and a multiplier. The multiplicand represents the annual loss in earnings and other benefits, as assessed at the trial date, while the multiplier discounts future pecuniary values into a single present-day lump sum amount. At present, multipliers in the UK are calculated using actuarial methods and based on assumed mortality and interest rates. However, it is entirely possible that these assumptions are incorrect, and if they are, then all claimants who rely on the same set of actuarial multipliers will be affected. In this article, we investigate how the uncertainty surrounding mortality and interest rate assumptions affects the precision of actuarial multipliers. With the aid of stochastic models, we estimate the possible range of values that an actuarial multiplier can take.
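    The lump-sum arithmetic described above, and the sensitivity of the multiplier to mortality and interest assumptions, can be sketched with a small Monte Carlo simulation. All figures below (survival curve, rates, shock sizes) are assumptions for illustration, not the article's models:

```python
import random

def multiplier(survival, rate):
    # Present value of an annual loss of 1 while the claimant survives:
    # sum over future years of survival probability times discount factor.
    return sum(p / (1 + rate) ** t for t, p in enumerate(survival, start=1))

# Assumed inputs: 20-year horizon, survival falling 1% a year, rate 2.5%.
base_survival = [0.99 ** t for t in range(1, 21)]
base = multiplier(base_survival, 0.025)

# Perturb the assumptions to see how wide the multiplier's range is.
random.seed(1)
draws = []
for _ in range(2000):
    rate = random.gauss(0.025, 0.005)   # uncertain discount rate
    shock = random.gauss(1.0, 0.05)     # proportional mortality shock
    surv = [p ** shock for p in base_survival]
    draws.append(multiplier(surv, rate))
draws.sort()
low, high = draws[50], draws[-51]       # central ~95% of the draws
```

    The lump sum is then the multiplicand (annual loss) times the multiplier; the spread between `low` and `high` indicates how much the award depends on the assumed mortality and interest rates.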
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 29
    Oxford University Press
    Publication Date: 2012-12-09
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 30
    Publication Date: 2012-12-09
    Description: In 2010, a ruling of the Court of Appeal of England and Wales quashed a conviction in a homicide case in which the evidence rested heavily on the association of a shoe sole with a crime scene footwear mark. The decision addressed and criticized the use of the Bayesian approach and likelihood ratios for this form of evidence. The court’s comments, and the values used by the footwear mark examiner in his Bayesian evaluation and likelihood ratio, are discussed. A contrast is drawn between this method and the traditional footwear mark evaluation used by footwear examiners in the USA and most other countries.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 31
    Publication Date: 2012-12-09
    Description: This article analyses and discusses issues that pertain to the choice of relevant databases for assigning values to the components of evaluative likelihood ratio procedures at source level. Although several formal likelihood ratio developments currently exist, both case practitioners and recipients of expert information (such as the judiciary) may be reluctant to consider them as a framework for evaluating scientific evidence in context. The recent ruling R v T and the ensuing discussions in many forums provide illustrative examples of this. In particular, it is often felt that likelihood ratio-based reasoning requires extensive quantitative information along with means for dealing with technicalities related to the algebraic formulation of these approaches. With regard to this objection, this article proposes two distinct discussions. In the first part, it is argued that, from a methodological point of view, there are additional levels of qualitative evaluation worth considering before focusing on particular numerical probability assignments. Analyses are proposed to show that, under certain assumptions, relative numerical values, as opposed to absolute values, may be sufficient to characterize a likelihood ratio for practical and pragmatic purposes. The feasibility of such qualitative considerations shows that the availability of hard numerical data is not a necessary requirement for implementing a likelihood ratio approach in practice. It is further argued that, even if numerical evaluations can be made, qualitative considerations may be valuable because they can further the understanding of the logical underpinnings of an assessment. In the second part, the article draws a parallel to R v T by concentrating on a practical footwear mark case received at the authors' institute. This case serves to exemplify the possible use of data from various sources in casework and helps to discuss the difficulty of reconciling the depth of theoretical likelihood ratio developments with the limited degree to which these developments can actually be applied in practice.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 32
    Publication Date: 2012-12-09
    Description: Forensic science evidence must be presented in a form that can be accommodated within the process of proof employed by judges and juries. This is a non-mathematical inductive process that seeks the ‘inference to the best explanation’ to a standard of proof beyond reasonable doubt. The question posed is not the mathematical probability of the prosecution hypothesis but whether, having regard to all the evidence before the court, the prosecution hypothesis is the only explicable hypothesis, in the sense that no reasonably possible defence hypothesis remains open. The challenge is to present forensic science evidence in a form that can be accommodated within this non-mathematical inductive standard of proof. It is argued that this is most effectively achieved if that evidence is tendered as a frequency rather than as a likelihood ratio.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 33
    Publication Date: 2012-12-09
    Description: The ability of the experienced forensic scientist to evaluate his or her results given the circumstances and propositions in a particular case, and to present this to the court in a clear and concise way, is very important for the legal process. Court officials can neither be expected to interpret scientific data, nor is it their task to do so (in our opinion). The duty of the court is rather to perform the ultimate evidence evaluation of all the information in the case combined, including police reports, statements from suspects and victims, witness reports, forensic expert statements, etc. Without the aid of the forensic expert, valuable forensic results may be overlooked or misinterpreted in this process. The scientific framework for forensic interpretation stems from Bayesian theory. The resulting likelihood ratio, which may be expressed on a verbal or a numerical scale, compares how probable the obtained results are given that one proposition holds with how probable they are given that the other proposition holds. A common misunderstanding is that this approach must be restricted to forensic areas such as DNA evidence, where extensive background information is available in the form of comprehensive databases. In this article we argue that the likelihood ratio approach is equally applicable in areas where the results rely on scientific background data combined with the knowledge and experience of the forensic scientist. In such areas the scale of the likelihood ratio may be coarser than in a DNA case, but the information conveyed by the likelihood ratio may nevertheless be highly valuable for the court.
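    The likelihood-ratio comparison described in this abstract is simple arithmetic; a minimal sketch with assumed numbers:

```python
def likelihood_ratio(p_results_given_h1, p_results_given_h2):
    # How much more probable the findings are under one proposition
    # than under the other.
    return p_results_given_h1 / p_results_given_h2

# Assumed values: findings of this kind are seen in 95% of cases if
# proposition 1 holds, but in only 1% of cases if proposition 2 holds.
lr = likelihood_ratio(0.95, 0.01)    # 95: support for proposition 1

# In odds form, posterior odds = prior odds * LR; the prior odds are
# for the court, not the forensic scientist, to assign.
prior_odds = 0.1
posterior_odds = prior_odds * lr
```

    This division of labour, with the scientist reporting the LR and the court supplying the prior, is the core of the Bayesian framework the abstract refers to.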
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 34
    Publication Date: 2012-12-09
    Description: Experts providing evidence in legal cases are universally recommended to be transparent, particularly in their reasoning, so that legal practitioners can critically check whether the conclusions are adequately supported by the results. However, when exploring the practical meaning of this recommendation it becomes clear that people have different things in mind. The UK appeal court case R v T painfully exposes the different views. In this article we argue that there can be a trade-off between clarity and transparency, and that in some cases it is impossible for the legal practitioner to be able to follow the expert’s reasoning in full detail because of the level of complexity. All that can be expected in these cases is that the legal practitioner is able to understand the reasoning up to a certain level. We propose that experts should only report the main arguments, but must make this clear and provide further details on request. Reporting guidelines should address the reasoning in more detail. Legal practitioners and scientists should not be telling each other what to do in the setting of a legal case, but in other settings more discussion will be beneficial to both. We see the likelihood ratio framework and Bayesian networks as tools to promote transparency and logic. Finally, we argue that transparency requires making clear whether a conclusion is a consensus and reporting diverging opinions on request.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 35
    Publication Date: 2012-12-09
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 36
    Publication Date: 2012-12-09
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 37
    Publication Date: 2012-12-09
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 38
    Publication Date: 2012-12-09
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 39
    Publication Date: 2012-12-09
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 40
    Oxford University Press
    Publication Date: 2012-12-09
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 41
    Publication Date: 2012-12-09
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 42
    Publication Date: 2016-03-03
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 43
    Publication Date: 2016-03-03
    Description: Taroni et al. (2016) discuss the controversial issue of parameter uncertainty in the context of forensic evidence evaluation. Although we share with the authors the main idea that the likelihood ratio (LR) framework is the best method for evaluating forensic evidence, we have a different view on this issue. The core question is: does it make sense to consider the uncertainty attached to a calculated value of the LR, and consequently, should we report a single value for the LR or in addition address its uncertainty? Taroni et al. (2016) argue for reporting a single value based on a ‘full-Bayesian’ approach, and accuse anyone who considers the uncertainty of an LR of ‘misconception of basic principles’ and ‘abuse of language’. However, their arguments presented as facts or logic are in fact choices or opinions. Furthermore, reporting a single number for the LR deprives the legal justice system of essential information needed to assess the reliability of the evidence. Therefore, we argue that forensic scientists should not only report an LR value, but also address its uncertainty and we explain why this is not a misconception or abuse of language.
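    One way to see the disputed point: when the rarity figure entering an LR comes from a database count, the count's sampling uncertainty induces a spread of plausible LR values. The numbers and the Beta-posterior device below are illustrative assumptions, not the authors' procedure:

```python
import random

# Assumed database: the corresponding feature occurred k times in n items.
k, n = 3, 1000
lr_point = 1 / (k / n)    # point-estimate LR of about 333

# Sample the unknown frequency from a Beta(k+1, n-k+1) posterior
# (uniform prior) and examine the spread of the implied LR.
random.seed(7)
draws = sorted(1 / random.betavariate(k + 1, n - k + 1)
               for _ in range(5000))
low, high = draws[125], draws[-126]    # central ~95% of the draws
```

    Whether such a spread should be reported alongside, or folded into, a single LR is exactly the disagreement between this article and the full-Bayesian position it responds to.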
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 44
    Publication Date: 2016-03-03
    Description: The use of the Bayes factor (BF) or likelihood ratio as a metric to assess the probative value of forensic traces is largely supported by operational standards and recommendations in different forensic disciplines. However, progress towards more widespread consensus about foundational principles is still fragile, as it raises new problems on which views differ. It is not uncommon, for example, to encounter scientists who feel the need to compute the probability distribution of a given expression of evidential value (i.e. a BF), or to place intervals or significance probabilities on such a quantity. This article presents arguments to show that such views involve a misconception of principles and an abuse of language. The conclusion of the discussion is that, in a given case at hand, forensic scientists ought to offer to a court of justice a single value for the BF, rather than an expression based on a distribution over a range of values.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 45
    Publication Date: 2016-03-03
    Description: Due to the uses of DNA profiling in criminal investigation and decision-making, it is ever more common that probabilistic information is discussed in courts. The people involved have varied backgrounds, as factfinders and lawyers are more trained in the use of non-probabilistic information, while forensic experts handle probabilistic information on a routine basis. Hence, it is important to have a good understanding of the sort of reasoning that happens in criminal cases, both probabilistic and non-probabilistic. In the present article, we report results on combining three normative reasoning frameworks from the literature: arguments, scenarios and probabilities. We discuss a hybrid model that connects arguments and scenarios, a method to probabilistically model possible scenarios in a Bayesian network, a method to extract arguments from a Bayesian network and a proposal to model arguments for and against different scenarios in standard probability theory. These results have been produced as parts of research projects on the formal and computational modelling of evidence. The present article reviews these results, shows how they are connected and where they differ, and discusses strengths and limitations.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 46
    Publication Date: 2016-03-03
    Description: This case comment analyses the appropriateness of the 60/40 written/oral weighting of the promotional exams challenged by Briscoe. The plaintiff claimed that the oral exam had less disparate impact on African Americans, was a better way to assess candidates’ ability, and hence should be weighted more. The comment shows that when more weight is given to the oral exam, the number of African American promotions ‘increases’, but the number of Hispanic promotions and the total number of minority promotions ‘decreases’. Furthermore, the analysis demonstrates that for most of the written/oral weightings, including the one advocated by the plaintiff, the number of African Americans who would be promoted to lieutenant remains the same, supporting the district court’s conclusion that the weighting used by the City did not have disparate impact on African Americans ‘as a race’. In addition, the comment illustrates that even though the oral exam had less disparate impact on African Americans, it had ‘more’ disparity for Hispanics. Considering minority candidates as a whole group, the disparities of the oral and the written exams were about the same.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 47
    Publication Date: 2016-03-03
    Description: The presumption of innocence is sacrosanct in Anglo-legal doctrine, yet how jurors interpret it remains unknown. This experiment manipulated the alleged crime (violent, child, or sexual assault) and the defendant’s physical appearance (good, mediocre, or bad). Following Savage (1954), uncertainty about the guilt of the defendant was conceptualized in terms of prospective jurors’ willingness to gamble on whether the defendant is guilty. For each case, participants were asked to choose between a standard lottery, with a specified probability of winning $100 versus nothing, and a trial gamble involving the unspecified probability of guilt (win $100) and innocence (win nothing) for the case presented. Participants indicated a preference for the standard lottery or the trial gamble, or indifference between the two options. Once a participant indicated indifference, the probability of guilt was set equal to the probability of winning the standard lottery. Across all three types of crime and all three categories of appearance, the median participant’s prior probability of guilt was close to 0.50, indicating that prospective jurors generally believed the defendants were just as likely to be guilty as innocent prior to the introduction of any evidence. A main effect for appearance was detected: ‘bad’ and ‘mediocre’ defendants were perceived as more likely to be guilty than ‘good’ defendants. There were no differences between the various alleged crimes. Overall, male participants and self-identified Republicans were the most likely to have high (>0.65) prior probabilities of guilt. Implications for legal doctrine are discussed.
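    The elicitation procedure amounts to searching for the standard-lottery win probability at which the participant is indifferent. A sketch with a simulated respondent (the latent value 0.50 is an assumption standing in for a real participant's belief):

```python
def elicit(prefers_lottery, steps=20):
    # Bisect on the standard lottery's win probability until the
    # participant is indifferent between lottery and trial gamble;
    # that probability is read off as their probability of guilt.
    lo, hi = 0.0, 1.0
    for _ in range(steps):
        mid = (lo + hi) / 2
        if prefers_lottery(mid):
            hi = mid    # lottery too attractive: lower its win chance
        else:
            lo = mid    # trial gamble preferred: raise the win chance
    return (lo + hi) / 2

# Simulated respondent who believes the defendant is guilty with
# probability 0.50 and always picks the better chance at $100.
latent = 0.50
prior = elicit(lambda p: p > latent)    # converges to ~0.50
```

    Real participants answer a sequence of discrete choices rather than a perfect bisection, but the indifference point is interpreted the same way.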
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 48
    Oxford University Press
    Publication Date: 2016-03-03
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 49
    Publication Date: 2015-12-02
    Description: Identifying specific human body fluids and establishing their presence in traces can be crucial in reconstructing alleged incidents in criminal cases. It is up to the forensic practitioner to test for the presence of body fluids, interpret the test results and draw scientifically supported conclusions that can be used in a court of law. This study presents a Bayesian network for the interpretation of test results for human saliva based on the presence of human salivary α-amylase. The Bayesian network can be used by forensic practitioners as an exploratory tool to form their expert opinion on the presence or absence of saliva in a trace.
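    The core update inside such a network is Bayes' theorem for the test result. A minimal sketch with assumed test characteristics (not the validated figures a casework network would use):

```python
def posterior_saliva(prior, sensitivity, false_positive_rate):
    # P(saliva | positive alpha-amylase test) by Bayes' theorem.
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Assumed: the test flags saliva 90% of the time, reacts to other
# material 5% of the time; prior belief that the trace is saliva: 0.30.
post = posterior_saliva(0.30, 0.90, 0.05)    # roughly 0.89
```

    A full Bayesian network chains several such updates and lets the practitioner explore how the conclusion shifts as the assumed rates and priors vary.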
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 50
    Publication Date: 2015-12-02
    Description: The discipline of forensic epidemiology, a branch of forensic medicine, provides a systematic approach to the assessment of general and specific (individual) causation, with the results suitable for presentation in a court of law. In the present paper some of the methods utilized in forensic epidemiology are described, along with examples of how such methods can be reliably applied to the evaluation of specific causality in criminal and civil matters. Included in the discussion is the presentation of two case studies in applied forensic epidemiology; one in a civil action for medical negligence, and the other in a homicide investigation.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 51
    Publication Date: 2015-12-02
    Description: Due to an error in the computer program used to summon potential jurors for service, the African-American percentage (4.17%) of jury pools was only one-half their percentage (8.25%) of the age-eligible population for about fifteen months. In June 2012, in Ambrose v. Booker, the Sixth Circuit accepted the statistical evidence as sufficient to meet the requirements for a prima facie case of unfair representation that the U.S. Supreme Court established in Duren. On the same day, the Michigan Supreme Court in People v. Bryant held that the statistics were insufficient. The Michigan Court accepted a new measure of under-representation, the disparity of risk (DR). The DR measure is shown to be extremely stringent, and an alternative measure is proposed, based on the probability that a jury randomly selected from the jury pool will have fewer minorities than a jury selected from the age-eligible population. The minority fraction of individuals ultimately serving on juries also depends on the fairness of the peremptory challenges made by the parties. A method for detecting unfairness is reviewed; its effectiveness depends on the number of minorities on the venire. A reanalysis of the Michigan data shows that if the proportion of African-Americans in the jury pool equaled their proportion in the age-eligible population, prosecutors could only reduce their proportion on juries to about 80% of their proportion in the population. In contrast, the criteria for adequate representativeness based on the DR measure adopted by the Michigan Court could lead to virtually no minorities serving on actual juries, as only three percent of the venires would have a sufficient number of minorities for a prosecutor's peremptory challenge of all of them to be classified as statistically significant. Our results indicate that when assessing statistics on the demographic mix of jury pools for legal significance, courts should consider the possible reduction in minority representation that can occur in the peremptory challenge proceedings.
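    The flavour of a probability-based jury-pool measure can be seen from a simpler binomial calculation using the percentages quoted in the abstract (the article's measure compares two randomly drawn juries; this sketch just contrasts the chance of an all-non-minority jury under each rate):

```python
from math import comb

def prob_at_most(k, n, p):
    # Binomial P(X <= k) minorities on a jury of n drawn at rate p.
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

n = 12                                   # jurors
pool, population = 0.0417, 0.0825        # rates quoted in the abstract

p_none_pool = prob_at_most(0, n, pool)              # ~0.60
p_none_population = prob_at_most(0, n, population)  # ~0.36
```

    Under the flawed pool, a jury with no African-American member is markedly more likely; this is the kind of effect a probability-based measure captures and that, the authors argue, the DR measure can obscure.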
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 52
    Publication Date: 2015-12-02
    Description: In violent crimes, adhesive tapes such as duct tape are often used by perpetrators, e.g. to tie up a victim. In the forensic examination of such tapes many different types of traces can be found, such as finger marks and human biological traces. These traces are first interpreted at source level. However, even when it is certain that a trace was donated by the suspect, this does not necessarily mean that he donated the trace while taping the victim: he could, for example, have used the tape roll from which the pieces came prior to the crime. The trace can therefore also be interpreted at activity level. For this, factors such as transfer, persistence and recovery, as well as the position of the trace as it would have been on the original roll, have to be taken into consideration. In this study, we have developed a Bayesian network which can aid the forensic practitioner in this interpretation. From a sensitivity analysis, we conclude that it would be most desirable to set up further studies to determine the most likely positions of DNA on tape rolls when there has only been innocent contact.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 53
    Publication Date: 2011-11-24
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 54
    Publication Date: 2011-11-24
    Description: This paper discusses the analysis of cases in which the inclusion or exclusion of a particular suspect, as a possible contributor to a DNA mixture, depends on the value of a variable (the number of contributors) that cannot be determined with certainty. It offers alternative ways to deal with such cases, including sensitivity analysis and object-oriented Bayesian networks, that separate uncertainty about the inclusion of the suspect from uncertainty about other variables. The paper presents a case study in which the value of DNA evidence varies radically depending on the number of contributors to a DNA mixture: if there are two contributors, the suspect is excluded; if there are three or more, the suspect is included; but the number of contributors cannot be determined with certainty. It shows how an object-oriented Bayesian network can accommodate and integrate varying perspectives on the unknown variable and how it can reduce the potential for bias by directing attention to relevant considerations and distinguishing different sources of uncertainty. It also discusses the challenge of presenting such evidence to lay audiences.
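    One way to integrate the uncertain number of contributors, sketched with invented likelihoods (the case-study values are not given in the abstract), is to marginalize the likelihood of the evidence over the number of contributors under each hypothesis before forming the ratio:

```python
# Assumed P(evidence | hypothesis, n contributors); n = 2 excludes
# the suspect, so the prosecution likelihood there is zero.
lik_prosecution = {2: 0.0,    3: 1.0e-6, 4: 4.0e-7}
lik_defence     = {2: 1.0e-7, 3: 2.0e-8, 4: 2.0e-8}
prior_n         = {2: 0.3,    3: 0.5,    4: 0.2}   # belief over n

numerator = sum(prior_n[n] * lik_prosecution[n] for n in prior_n)
denominator = sum(prior_n[n] * lik_defence[n] for n in prior_n)
overall_lr = numerator / denominator    # ~13: inclusion, but tempered
```

    Note that the average is taken over likelihoods, not over the per-n likelihood ratios; averaging the LRs themselves would be incorrect. An object-oriented Bayesian network performs this marginalization automatically while keeping the sources of uncertainty distinct.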
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 55
    Publication Date: 2011-11-24
    Description: The concept of balancing is of central importance in the assessment of legislative choices and in constitutional review. In conformity with the global tendency, balancing is increasingly used in judicial practice as an argumentation technique for solving legal disputes; more and more, judges at all levels ground their decisions on the balancing of individual rights, interests, principles, needs and values. Legal science has formulated theoretical and formal models to explain the argumentation structure of balancing and the criteria governing the argumentation process but, in the absence of a conceptual model that encompasses all elements in play and enables a comparative mechanism to be abstracted, mapping instances of judicial practice to abstract theories is still difficult. In this context, the objective of the project described here is to allow the logic of judicial practice to emerge from cases, verifying from the bottom up the assumptions of theoretical models. Starting from a broad analysis of Italian cases, the paper analyses the object of this operation, i.e. what is ‘balanced’ and what is the nature of this process. The research was conducted by analysing the so-called ‘massime’ (case law abstracts) of the Italian High Courts (Constitutional Court, Supreme Court, Council of State), of the administrative courts (Regional Administrative Tribunals) and of a selection of lower court decisions. The methodology is divided into an initial phase of documentary collection and storage, a second phase of conceptual modelling and a third phase of data analysis.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 56
    Oxford University Press
    Publication Date: 2011-11-24
    Description: The ‘minimax theorem’ is the most recognized theorem for determining strategies in a two-person zero-sum game. Other common strategies exist, such as the ‘maximax principle’ and the ‘minimize the maximum regret principle’. All these strategies follow the Von Neumann and Morgenstern linearity axiom, which states that numbers in the game matrix must be cardinal utilities and can be transformed by any positive linear function f(x) = ax + b, a > 0, without changing the information they convey. This paper describes risk-averse strategies for a two-person zero-sum game where the linearity axiom may not hold. With connections to gambling theory, there is evidence to show why it can be optimal for the favourable player to adopt risk-averse strategies. Based on this approach, an arbitration value is obtained in a litigation game, where the amount awarded to the victim is less than expectation and shown to be ‘fairer’ when compared with the amount obtained using the Von Neumann and Morgenstern game theory framework.
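    The linearity axiom mentioned in the abstract is easy to check numerically: transforming every payoff by f(x) = ax + b with a > 0 leaves the optimal mixed strategy unchanged. A sketch for a 2x2 zero-sum game with no saddle point (the payoffs are invented):

```python
def row_strategy(game):
    # Closed-form optimal probability of playing row 1 in a 2x2
    # zero-sum game [(a, b), (c, d)] with no saddle point.
    (a, b), (c, d) = game
    return (d - c) / (a - b - c + d)

game = [(3.0, -1.0), (-2.0, 4.0)]
p = row_strategy(game)                      # 0.6

# Apply f(x) = 2x + 5 to every payoff: the strategy is unchanged.
scaled = [tuple(2 * x + 5 for x in row) for row in game]
assert abs(row_strategy(scaled) - p) < 1e-12
```

    The risk-averse strategies discussed in the article arise precisely when this linear-utility assumption is dropped, e.g. when a litigant's utility is concave in money.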
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 57
    Publication Date: 2011-11-24
    Description: The present paper focuses on the analysis and discussion of a likelihood ratio (LR) development for propositions at a hierarchical level known in this context as the ‘offence level’. Existing literature on the topic has considered LR developments for so-called offender-to-scene transfer cases. These settings involve, in their simplest form, a single stain found at a crime scene, but with possible uncertainty about the degree to which that stain is relevant (i.e. that it has been left by the offender). Extensions to multiple stains or multiple offenders have also been reported. The purpose of this paper is to develop an LR for offence level propositions when case settings involve potential transfer in the opposite direction, i.e. victim/scene-to-offender transfer. This setting has not previously been considered. The rationale behind the proposed LR is illustrated through graphical probability models (i.e. Bayesian networks). The role of various uncertain parameters is investigated through sensitivity analyses as well as simulations.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 58
    Publication Date: 2011-11-24
    Description: In this paper, we introduce methodology—causal directed acyclic graphs (DAGs)—that empirical researchers can use to identify causation, avoid bias, and interpret empirical results. This methodology is popular in a number of disciplines, including statistics, biostatistics, epidemiology and computer science, but has not yet appeared in the empirical legal literature. Accordingly, we outline the rules and principles underlying this methodology and then show how it can assist empirical researchers through both hypothetical and real-world examples found in the extant literature. While causal DAGs are not a panacea for all empirical problems, we show that they have potential to make the most basic and fundamental tasks, such as selecting covariate controls, relatively easy and straightforward.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 59
    Publication Date: 2011-11-24
    Description: Legal analysis is dominated by legal arguments, and the assessment of any legal claim requires the assessment of the strengths and weaknesses of those arguments. The ‘logocratic’ method is a systematic method for assessing the strengths and weaknesses of arguments. More specifically, it is a method designed to help the analyst determine what degree of warrant the premises of an argument provide for its conclusion. Although the method is applicable to any type of argument, this essay focuses on the logocratic framework for assessing the strengths and weaknesses of evidentiary legal arguments, arguments offered in litigation in which evidentiary propositions are proffered to support hypotheses. The focus is on American law, but the logocratic analysis offered here could be adjusted without much trouble to handle arguments about evidence in other systems of litigation. In any legal system that aspires to have a fact-finding process that is sufficiently reliable to meet the requirements of justice, we might fashion an analogue for the Socratic maxim ‘the unexamined life is not worth living’: the unexamined evidentiary argument is not worth believing. The logocratic method seeks to help the evidence analyst pursue that Socratic mission, tailored to the rules and institutions of evidence law.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 60
    Publication Date: 2011-11-24
    Description: The formal structure of decision-making under uncertainty used in legal trials bears a noteworthy similarity to the structure of decision-making under uncertainty used in hypothesis testing in empirical science. The first purpose of this article is to explicate those similarities. Secondly, the article reviews the historical origins of these decision-making schemes in both law and science, finding that they evolved independently of each other to serve similar functions.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 61
    Publication Date: 2011-11-24
    Description: When many individual plaintiffs have similar claims against the same defendant, it is often more efficient for them to be combined into a single class action. Due to their increased complexity and larger stakes, in the USA there are special criteria a party seeking to proceed as a class action needs to satisfy. Statistical evidence is often submitted to establish that the members of the proposed class were affected by a common event or policy. In equal employment cases involving an employer with a number of locations or subunits, defendants may argue that the data should be examined separately for each unit, while plaintiffs may pool the data into one or several large samples or focus on a few units in which statistical significance was observed. After describing the statistical issues involved, it will be seen that requiring plaintiffs to demonstrate a statistically significant disparity in a pre-set fraction (e.g. a majority) of the subunits is too stringent, as the power of the statistical test to detect a meaningful disparity in most subunits is too small. On the other hand, when many statistical tests are calculated on data from a fair system, a small percentage of significant disparities will be obtained. Thus, allowing a class action to proceed if the plaintiffs can demonstrate a statistically significant difference in a few subunits is too lax. The use of established methods for combining statistical tests for data organized by appropriate subgroups will be illustrated on data from two recent cases. Using the concept of power, the expected number, E, of subunits in which a statistically significant result would occur if there were a legally meaningful disparity can be determined. Then the observed number, O, of units with a significant disparity can be compared to E, to see whether the data are consistent with a pattern indicating unfairness (O close to E) or reflecting fairness (O clearly less than E). Without such a comparison, the number of units with a statistically significant disparity is not meaningful. Both parties in Dukes v. Wal-mart introduced summaries of the p-values of many individual statistical tests that grouped them into a small number of categories. An appropriate overall procedure combines them into a single summary statistic. This analysis shows that the promotion data for the 40 or 41 regions in the Wal-mart case are consistent with an overall system in which the odds of an eligible female being promoted were about 70–80% of those of a male. A similar analysis of the p-values of Wal-mart's subunit regressions is also consistent with a general pattern of underpayment of female employees relative to similarly qualified males.
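A minimal sketch of the combination idea the abstract describes, using Fisher's method (one established way of combining independent tests) together with the E-versus-O comparison. All numbers below are hypothetical illustrations, not figures from the case:

```python
import math

def fisher_combined_pvalue(pvalues):
    """Fisher's method: X = -2 * sum(ln p_i) ~ chi-square with 2k df.
    For even degrees of freedom the survival function has a closed form,
    so no statistics library is needed."""
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= (x / 2.0) / i
        total += term
    return math.exp(-x / 2.0) * total

# Hypothetical per-subunit p-values: only one is significant on its own,
# yet the combined test detects the overall pattern.
pvals = [0.09, 0.12, 0.04, 0.20, 0.07, 0.15]
combined = fisher_combined_pvalue(pvals)

# Expected vs observed counts of individually significant subunits:
# if each subunit test had power ~0.30 against the alleged disparity,
# E = 6 * 0.30 = 1.8 significant results would be expected; here O = 1.
power_per_unit = 0.30
E = power_per_unit * len(pvals)
O = sum(p < 0.05 for p in pvals)
print(f"combined p = {combined:.4f}, E = {E:.1f}, O = {O}")
```

The point of the sketch: counting significant subunits in isolation (O = 1 of 6) understates the evidence, while the combined statistic summarizes all six tests at once.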
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 62
    Oxford University Press
    Publication Date: 2011-11-24
    Description: In 1970, Michael O. Finkelstein (with William B. Fairley) proposed that under some circumstances a jury in a criminal trial might be invited to use Bayes’ Theorem to address the issue of the identity of the criminal perpetrator. In 1971, Laurence Tribe responded with a rhetorically powerful and wide-ranging attack on what he called ‘trial by mathematics’. Finkelstein responded to Tribe's attack by further explaining, refining and defending his proposal. Although Tribe soon fell silent on the use of mathematical and formal methods to dissect or regulate uncertain factual proof in legal proceedings, the Finkelstein–Tribe exchange precipitated a decades-long debate about trial by mathematics. But that debate, which continues to this day, became generally unproductive and sterile years ago. This happened in part because two misunderstandings plagued much of the debate almost from the start. The first misunderstanding was a widespread failure to appreciate that mathematics is part of a broader family of rigorous methods of reasoning, a family of methods that is often called ‘formal’. The second misunderstanding was a widespread failure to appreciate that mathematical and formal analyses (including analyses that use numbers) can have a large variety of purposes. Before any further major research project on trial by mathematics is begun, interested researchers in mathematics, probability, logic and related fields, on the one hand, and interested legal professionals, on the other hand, should try to reach agreement about the possible distinct purposes that any given mathematical or formal analysis of inconclusive argument about uncertain factual hypotheses might serve. The article lists some of those possible purposes.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 63
    Oxford University Press
    Publication Date: 2011-11-24
    Description: This paper uses tools from argumentation and artificial intelligence to build a system to analyse reasoning from a motive to an action and reasoning from circumstantial evidence of actions to a motive. The tools include argument mapping, argumentation schemes, inference to the best explanation and a hybrid method of combining argument and explanation. Several examples of use of relevant motive evidence in law are used to illustrate how the system works. It is shown how adjudicating cases where motive evidence is relevant depends on a balance of argumentation that can be tilted to one side or the other using plausible reasoning that combines arguments and explanations.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 64
    Publication Date: 2011-11-24
    Description: It is investigated how implementing the likelihood ratio (LR) framework works out in the case of camera identification based on image-sensor-specific noise patterns. Two typical case scenarios are considered, one with images of low quality and the other with images of high quality. In both cases, it is possible to obtain statistical distributions having a good fit with the reference data both for ‘matching’ and for ‘non-matching’ comparisons, and LRs are determined. It turns out that if the reference data are well separated, in the case of ‘matching’ images/cameras, the statistical fit of the distribution for ‘non-matches’ is constantly evaluated in a range where there is a lack of reference data. Because of this extrapolation issue, the LRs that emerge are not reliable. This is not a problem unique to camera identification: it emerges whenever the informative value of a forensic comparison is high. An alternative approach is presented, which consists of choosing a threshold value separating ‘matches’ from ‘non-matches’ and quantifying the strength of evidence of being larger/smaller than this value. If sample sizes of reference data increase, LR results will increase as well, and it is shown that this approach is stable.
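The extrapolation issue can be illustrated with a toy score-based LR in which the ‘matching’ and ‘non-matching’ score distributions are fitted normals. All parameters here are hypothetical, chosen only to show how a well-separated score forces the non-match density to be evaluated far beyond any observed reference data:

```python
import math

def normal_pdf(x, mu, sigma):
    # Density of a normal distribution (stdlib only).
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Hypothetical fitted score distributions for same-camera ('match') and
# different-camera ('non-match') comparisons.
MATCH_MU, MATCH_SD = 10.0, 2.0
NONMATCH_MU, NONMATCH_SD = 0.0, 1.0

def likelihood_ratio(score):
    return normal_pdf(score, MATCH_MU, MATCH_SD) / normal_pdf(score, NONMATCH_MU, NONMATCH_SD)

# A score of 9 is ordinary for a true match, but nine standard deviations
# into the non-match tail: the astronomically large LR rests entirely on
# the extrapolated tail of the fitted non-match distribution, a region
# with no reference data at all.
lr = likelihood_ratio(9.0)
print(f"LR = {lr:.3e}")
```

This is the sense in which the fitted-distribution LR becomes unreliable precisely when the comparison is most informative: the number reported depends on a parametric tail that the data cannot check.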
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 65
    Publication Date: 2011-11-24
    Description: The offence risk posed by individuals who are arrested, but where subsequently no charge or caution is administered, has been used as an argument for justifying the retention of such individuals' DNA and identification profiles. Here we consider the UK Home Office arrest-to-arrest data analysis, and find it to have limited use in indicating risk of future offence. In doing so, we consider the appropriateness of the statistical methodology employed and the implicit assumptions necessary for making such inference concerning the rearrest risk of a further individual. Additionally, we offer an alternative model that would provide an equally accurate fit to the data, but which would appear to have sounder theoretical justification and suggest alternative policy direction. Finally, we consider the implications of using such statistical inference in formulating national policy, and highlight a number of sociological factors that could be taken into account so as to enhance the validity of any future analysis.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 66
    Publication Date: 2011-11-24
    Description: From the Wiley website: `Probability theory, implemented through graphical methods, specifically Bayesian networks, offers a powerful tool to deal with complex questions in forensic science and discover valid patterns in data. [This book] provides a unique and comprehensive introduction to the use of Bayesian networks for the evaluation of scientific evidence in forensic science.' The book opens with several introductory chapters on probability, Bayesian networks (BNs) and basic principles of evidence evaluation. There follow chapters on DNA evidence and transfer evidence such as fibres. The final chapters cover some more advanced general topics such as combinations of evidence, sensitivity analysis and qualitative and continuous networks. The level of discussion is reasonably elementary and proceeds at a leisurely pace, allowing an interested reader with little mathematical training to follow the arguments. But do not let me mislead you into thinking this is `light reading': the issues in forensic problems can be subtle and there are many aspects to consider, so one can easily get lost in complexities. The theoretical development is illustrated with simple examples worked through in considerable detail. Indeed, while the level of detail provided for the early examples will be appreciated by many readers, as the book advances the examples are still treated in great detail, which I feel precluded more substantial and realistic examples. For the very simple problems amenable to such treatment, the overhead in setting up a problem in the BN formalism can seem not worth the benefit. In practice, the reader is encouraged to use BN software to explore further examples on their own; specifically, the authors use the HUGIN package for which a `Lite' version is available free.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 67
    Publication Date: 2014-06-04
    Description: Individualization, the claim to be able to reduce the potential donor pool of a forensic trace to a single source, has long been criticized. This criticism was echoed by a 2009 U.S. National Research Council report, which called such claims unsupportable for any discipline save nuclear DNA profiling. This statement demanded a response from those disciplines, such as fingerprint analysis, that have historically designated ‘individualization’ one of their approved testimonial conclusions. This article analyses three serial responses to this challenge by the U.S. fingerprint profession. These responses posited new terms for testimonial reports or modified the definition of individualization. The article argues that these reforms have yet to ‘fix’ individualization and that all three reforms suffered semantic and conceptual difficulties. The article concludes by suggesting that these difficulties may be traced to the insistence on retaining, and somehow justifying, the term and concept ‘individualization’, instead of developing new terms and concepts from a defensible reasoning process.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 68
    Publication Date: 2014-06-04
    Description: This case comment critiques the Supreme Court of Canada’s decision in R v. Mabior. In Mabior , Chief Justice McLachlin affirmed the criminalization of human immunodeficiency virus (HIV) non-disclosure to sexual partners, and sought to clarify exactly when criminal sanctions apply. Citing expert evidence, McLachlin CJC held that criminal liability is appropriate for HIV non-disclosure when there is a ‘realistic possibility of transmission’ and that only condom use combined with antiretroviral therapy reduces this risk enough to preclude liability. Using the same expert evidence, I calculate the transmission rates underlying this argument and show that McLachlin CJC’s use of statistics results in logical contradictions and uncertain liability. I argue that her statistical approach is unworkable and I propose an alternative non-disclosure regime.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 69
    Oxford University Press
    Publication Date: 2014-09-02
    Description: The conceptual foundations of burdens of proof are examined, and the unified theory of evidentiary devices derivable from those foundations is explicated. Both the conceptual foundations and the unified theory generated are shown to rest on questionable assumptions about conventional probability theory. The resulting analytical difficulties are analyzed. Inference to the best explanation and the relative plausibility theory are examined as potentially providing the foundation for a superior conceptualization of the burden of proof.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 70
    Oxford University Press
    Publication Date: 2014-09-02
    Description: Since the Human Rights Act 1998, scholars and courts have dedicated considerable attention to the presumption of innocence. A major strand of the ensuing debate has focused on the scope of this safeguard. Many academics have argued in favour of according to the presumption a substantive—as opposed to a procedural—role. In other words, these scholars maintain that the presumption set in art. 6(2) of the European Convention on Human Rights (ECHR) should have some influence on the definition of criminality. Courts seem sympathetic to this approach, albeit not following it to the full extent. The article, instead, defends a procedural understanding of the presumption of innocence, on the basis of interpretive arguments concerning art. 6(2) ECHR. Besides, it shows that adopting this conception does not entail lowering the protection of the individual before the substantive criminal law.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 71
    Publication Date: 2014-09-02
    Description: In the academic literature, three approaches to rational legal proof are investigated, broadly speaking based, respectively on Bayesian statistics, on scenario construction and on argumentation. In this article, these approaches are discussed in light of a distinction between direct and indirect probabilistic reasoning. Direct probabilistic reasoning directly reasons from evidence to hypotheses, whereas indirect probabilistic reasoning reasons from hypotheses to evidence (and then back to the hypotheses). While statistical and story-based approaches usually model indirect probabilistic reasoning, argumentation-based approaches usually model direct probabilistic reasoning. It has been suggested that all legal probabilistic reasoning should be indirect, but in this article, it is argued that direct probabilistic reasoning has a rational basis and is, moreover, sometimes easier to perform for judges than indirect probabilistic reasoning. Moreover, direct probabilistic reasoning can be analysed in terms of standard probability theory, resulting in an alternative, non-Bayesian use of the terms ‘prior’ and ‘posterior’ probability and without the need to estimate unconditional probabilities of the hypotheses.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 72
    Publication Date: 2014-09-02
    Description: Burdens and standards of proof are primarily concerned with minimizing the expected cost of error. This probabilistic goal explains both the levels at which general standards of proof are set, and the general allocation of burdens of proof to the plaintiff or prosecution. Variations from these general positions, achieved through presumptions and affirmative defences, can also be understood as directed towards minimizing expected error costs. This model describes the operation of many presumptions and defences, and also provides a normative basis for criticising presumptions and defences that fail to minimize the expected cost of error. However, it struggles with classes of cases where one side of the dispute faces systemic proof difficulties. Minimizing error costs would lead to an expectation of a serious imbalance in error rates; however, varying the standard of proof to equalize error rates would fail to minimize error costs. This is a genuine dilemma. No solution is offered.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 73
    Publication Date: 2014-09-02
    Description: Mistakes in evidential reasoning can have severe consequences. Especially, errors in the use of statistics have led to serious miscarriages of justice. Fact-finders and forensic experts make errors in reasoning and fail to communicate effectively. As tools to prevent mistakes, three kinds of methods are available. Argumentative methods analyse the arguments and counterarguments that are presented in court. Narrative methods consider the construction and comparison of scenarios of what may have happened. Probabilistic methods show the connections between the probability of hypothetical events and the evidence. Each of the kinds of methods has provided useful normative maxims for good evidential reasoning. Argumentative and narrative methods are especially helpful for the analysis of qualitative information, but do not come with a formal theory that is as well-established as probability theory. In probabilistic methods, the emphasis is on numeric information, so much so that a standard criticism is that these methods require more numbers than are available. This article offers an integrating perspective on evidential reasoning, combining the strengths of each of the kinds of methods: the adversarial setting of arguments pro and con, the globally coherent perspective provided by scenarios, and the gradual uncertainty of probabilities. In the integrating perspective, arguments and scenarios are interpreted in the quantitative setting of standard probability theory. In this way, the integrated perspective provides a normative framework that bridges the communicative gap between fact-finders and forensic experts. Both qualitative and quantitative information can be used safely, focusing on what is relevant.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 74
    Publication Date: 2014-09-02
    Description: Although courts have incorporated statistical hypothesis testing into their evaluation of numerical evidence in a variety of cases, they have primarily focused on one aspect of a statistical analysis: whether or not the result is ‘statistically significant’ at the 0.05 or ‘two-standard deviation’ level. The theory underlying hypothesis testing is also concerned with the power of the test to detect a meaningful difference. This article shows that using the insights provided by power calculations should assist courts to better interpret and evaluate the statistical analyses submitted into evidence. In particular, the concept of power should help in assessing whether a sample is too small to provide reliable inferences. On the other hand, very large samples can classify minor differences as statistically significant. This occurs when the power of the test at the standard 0.05 level is very high. It will be seen that requiring significance at a more stringent level, e.g. 0.005, which can be determined from a power calculation, often resolves this problem.
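Both points in the abstract, small samples with too little power and huge samples that flag trivial differences, can be sketched with a standard power calculation for a one-sided two-proportion z-test. The sample sizes and proportions below are hypothetical illustrations:

```python
import math

def norm_cdf(z):
    # Standard normal CDF via the error function (stdlib only).
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def power_two_prop(p0, p1, n, z_alpha):
    """Approximate power of a one-sided two-sample z-test for a difference
    in proportions, n observations per group (normal approximation)."""
    se = math.sqrt(p0 * (1 - p0) / n + p1 * (1 - p1) / n)
    return norm_cdf((p1 - p0) / se - z_alpha)

Z_05, Z_005 = 1.645, 2.576  # one-sided critical values for alpha = 0.05 and 0.005

# Small sample: even a meaningful disparity (50% vs 60%) is detected less
# than half the time, so a non-significant result is weak evidence of fairness.
low_power = power_two_prop(0.50, 0.60, 100, Z_05)

# Huge sample: a practically trivial one-point difference (50% vs 51%) is
# flagged as 'significant' most of the time at the 0.05 level ...
trivial_flagged = power_two_prop(0.50, 0.51, 20_000, Z_05)

# ... but far less often at the stricter 0.005 level that a power
# calculation would suggest for a sample of this size.
trivial_strict = power_two_prop(0.50, 0.51, 20_000, Z_005)
print(low_power, trivial_flagged, trivial_strict)
```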
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 75
    Publication Date: 2014-09-02
    Description: This article discusses rule-based presumptions that are authoritatively established, as distinct from other types of presumptions that are generalization-based or policy-based. It first introduces some legal distinctions that are used to define presumptions in law, and then presents extended examples of legal presumptions drawn from the statute and case law governing compensation for vaccine-related injuries in the USA. It proposes a formal method of representing rule-based legal presumptions that utilizes a three-valued, default logic. Finally, it uses the vaccine-injury compensation cases and the concept of legal presumption to explore difficulties in determining the burdens of production and persuasion, the meaning of legal terms in propositions to be proved and the inferences to be drawn from them.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 76
    Publication Date: 2014-06-04
    Description: Interpretation concepts of scientific evidence have always been under discussion among forensic scientists and among all stakeholders of criminal proceedings in general. It seems that this issue has been attracting more attention since the introduction of the case assessment and interpretation (CAI) model in the late nineties and even more since the release of the National Academies of Science report ‘Strengthening Forensic Science in the United States’ in 2009. Following the debates there is, however, a certain danger of overcompensation if the input of stakeholders from e.g. inquisitorial criminal systems is under-represented. Without doubt, a likelihood ratio-based approach can be a powerful tool assisting in logically complex case assessments and judicial considerations of evidence. However, the application of this approach should be an option rather than an international standard as it concerns the concept of the stakeholder’s roles more profoundly in some countries than in others and may possibly take some countries by surprise. In the following article, this is discussed and some proposals are put forward which appear suitable to strengthen the evaluation of forensic results by the principle of methodological pluralism rather than by an exclusive and compulsory commitment to only one approach.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 77
    Publication Date: 2014-06-04
    Description: Searching against larger Automated Fingerprint Identification System (AFIS) databases may increase the likelihood of finding a suspect in the database. However, Dror and Mnookin (2010) have argued that this also leads to an increase in the number of similar non-matching prints, which could lead to an erroneous identification. Using simulations, we explore the relation between database size and two outcome factors: close non-matching prints and overall database sensitivity, which is a measure of discriminability between true matches and close non-matches. We find that larger databases tend to increase both the likelihood of finding the suspect in the database as well as the number of close non-matching prints. However, the former tends to asymptote while the latter increases without bound, and this leads to an initial increase and then a decrease in the sensitivity of the database as more prints are added. This suggests the existence of an optimal database size, and that caution should be observed when interpreting results from larger databases. Quantitative evidentiary techniques such as likelihood ratios have the potential to address some of these concerns, although they too must consider the database size when calculating the likelihood ratio. Implications for practitioners are discussed.
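The unbounded growth of close non-matches described above can be reproduced with a very small simulation. The score model here (standard normal noise for unrelated prints, a fixed 'close non-match' threshold) is a hypothetical stand-in for the authors' simulations, chosen only to show the linear growth:

```python
import random

random.seed(0)

def count_close_nonmatches(db_size, threshold=3.0):
    """Model each unrelated print's similarity score as standard normal
    noise; a 'close non-match' is a chance score above the threshold."""
    return sum(random.gauss(0.0, 1.0) > threshold for _ in range(db_size))

# The chance of finding the true suspect asymptotes once he or she is in
# the database, but the expected number of close non-matches grows linearly
# with database size (about db_size * P(Z > 3), roughly db_size * 0.00135),
# i.e. without bound.
small_count = count_close_nonmatches(10_000)
large_count = count_close_nonmatches(500_000)
print(small_count, large_count)
```

A fiftyfold larger database yields roughly fifty times as many chance scores above the threshold, which is the mechanism behind the sensitivity eventually decreasing as prints are added.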
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 78
    Publication Date: 2013-06-08
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 79
    Publication Date: 2012-09-07
    Description: Inference in court is subject to scrutiny for structural correctness (e.g. deductive or non-monotonic validity) and probative weight in determinations such as logical relevancy and sufficiency of evidence. These determinations are made by judges or informally by jurors who typically have little, if any, training in formal or informal logical forms. This article explores the universal sufficiency of a single intuitive categorical natural language logical form (i.e. ‘defeasible class-inclusion transitivity’, DCIT) for facilitating such determinations and explores its effectiveness for constructing any typical inferential network in court. This exploration includes a comparison of the functionality of hybrid branching tree-like argument structures with the homogenous linear path argument structure of DCIT. The practicality of customary dialectical argument semantics and conceptions of probative weight are also examined with alternatives proposed. Finally, the issues of intelligibility and acceptability by end users in court of logical models are examined.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 80
    Publication Date: 2012-09-07
    Description: The probabilistic representation of proof is often criticized for failing adequately to reflect the quantity or weight of evidence. There are cases (such as the naked statistical evidence hypotheticals) where a slight body of evidence appears to provide a strong measure of probabilistic support; however, a common intuition is that the evidence would lack sufficient weight to constitute legal proof. Further, if the probability measure has no relation to the weight of evidence, it would appear to express scepticism about the value of evidence. Why bother considering fresh evidence, increasing the weight of evidence, if it provides no epistemic benefit? Probability theory can avoid the spectre of scepticism. Fresh evidence should be considered because it offers the promise of increased certainty. In this article, I prove the relationship between quantity of evidence and these expected increases, and provide a computer model of the relationship. As the weight of evidence increases, greater certainty can be expected. Equivalently, on numerous runs of the model, greater certainty is achieved on average. But in particular cases, certainty may remain the same or decrease. The average and expected increase will be more accentuated where more decisive evidence is available, but the result still holds for situations where the evidence is merely probative. Notwithstanding that certainty can be expected to increase, the probability measure expected from considering fresh evidence is, by definition, equal to the prior probability measure. The model also throws doubt on the notion that an increase in the weight of evidence will bring greater stability or weight in the probability assessment. Indeed, the opposite appears to be the case. The more detailed evidence shifts cases from the general to the particular, and this is reflected in sharper movements in probability assessments.
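The central claim, that the expected posterior equals the prior while expected certainty nonetheless increases, is the martingale property of Bayesian updating, and a small Monte Carlo sketch can exhibit it. The prior and likelihoods below are arbitrary hypothetical numbers, not the article's model:

```python
import random

random.seed(1)

PRIOR = 0.30            # prior probability that the hypothesis is true
P_E_GIVEN_H = 0.80      # chance the fresh evidence appears if it is true
P_E_GIVEN_NOT_H = 0.20  # chance it appears anyway if it is false

def posterior(evidence_seen):
    # Bayes' theorem for a single binary item of fresh evidence.
    if evidence_seen:
        num, alt = PRIOR * P_E_GIVEN_H, (1 - PRIOR) * P_E_GIVEN_NOT_H
    else:
        num, alt = PRIOR * (1 - P_E_GIVEN_H), (1 - PRIOR) * (1 - P_E_GIVEN_NOT_H)
    return num / (num + alt)

def one_run():
    # Draw whether the hypothesis is true, then whether the evidence appears.
    h = random.random() < PRIOR
    p_e = P_E_GIVEN_H if h else P_E_GIVEN_NOT_H
    return posterior(random.random() < p_e)

runs = [one_run() for _ in range(100_000)]

# Averaged over possible fresh evidence, the posterior equals the prior ...
avg_posterior = sum(runs) / len(runs)

# ... yet expected certainty (distance of the assessment from 50/50) rises.
avg_certainty = sum(abs(p - 0.5) for p in runs) / len(runs)
prior_certainty = abs(PRIOR - 0.5)
print(round(avg_posterior, 3), prior_certainty, round(avg_certainty, 3))
```

In each particular run the posterior may move toward or away from 0.5, but on average the movement away dominates, which is exactly the abstract's distinction between the expected probability measure and expected certainty.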
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 81
    Oxford University Press
    Publication Date: 2012-09-07
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
  • 82
    Publication Date: 2012-09-07
    Description: In this article, we provide a formal logical model of evidential reasoning with proof standards and burdens of proof, which enables us to evaluate evidential reasoning by comparing stories on either side of a case. It is based on a hybrid inference model that combines argumentation and explanation, using inference to the best explanation as the central form of argument. The model is applied to one civil case and two criminal cases. It is shown to have some striking implications for modelling and using traditional proof standards like preponderance of the evidence and beyond reasonable doubt.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 83
    Publication Date: 2012-09-07
    Description: The article proposes a normative theory of inferential reasoning for criminal fact finding, centred on the concept of ‘analogy’. While evidence law scholars have devoted little attention to the topic, the article maintains that analogy deserves more consideration. In particular, it argues that an analogical theory of inferential reasoning has three main advantages. First, the theory makes it possible to incorporate within a single coherent framework the important insights of different approaches to ‘reasoning under uncertainty’; indeed, it welcomes both the Pascalian notion of ‘relevance’ based on the Bayesian likelihood ratio and the Baconian concept of ‘weight’. Secondly, it helps advance the conventional understanding of the reference class problem, an evidential conundrum widely discussed in the recent legal scholarship. Finally, the theory allows for a functional taxonomy of reasonable doubts.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 84
    Publication Date: 2012-09-07
    Description: Fifty years of effort in artificial intelligence (AI) and the formalization of legal reasoning have produced both successes and failures. Considerable success in organizing and displaying evidence and its interrelationships has been accompanied by failure to achieve the original ambition of AI as applied to law: fully automated legal decision-making. The obstacles to formalizing legal reasoning have proved to be the same ones that make the formalization of commonsense reasoning so difficult, and are most evident where legal reasoning has to meld with the vast web of ordinary human knowledge of the world. Underlying many of the problems is the mismatch between the discreteness of symbol manipulation and the continuous nature of imprecise natural language, of degrees of similarity and analogy, and of probabilities.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 85
    Publication Date: 2013-03-13
    Description: The article reviews the use of statistical tests to establish a prima facie case that an exam has a disparate impact on minorities. Two common scenarios are discussed. The first is that promotions are made in accordance with the ‘rank-order’ of the exam scores or a composite of the exam scores and some other factors. Courts have used several statistical tests in this situation, which may lead to conflicting conclusions from similar data. It will be shown that when the spreads of the exam scores in both groups are close to each other, the modified Wilcoxon test has desirable statistical properties. When the spreads of the exam scores of the two groups are noticeably different, a two-test procedure is proposed and shown to have higher power, especially when the spread of the minority scores is less than that of the majority. The second situation occurs when, once an applicant passes the exam, s/he is eligible for further consideration and the actual exam score no longer matters. Courts may need to consider both the practical and statistical significance of the difference in pass rates. Small, unimportant differences may reach statistical significance when the numbers of applicants are large. In contrast, large differences in pass rates may not be detected as statistically significant in small samples. Two tables are provided to assist courts in reaching more consistent decisions when statistical and practical significance may not agree.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
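The abstract's point about statistical versus practical significance in pass rates can be illustrated with a standard pooled two-proportion z-test. The applicant counts below are illustrative, not from any case:

```python
import math

def two_proportion_z(pass_a, n_a, pass_b, n_b):
    """Two-sample z statistic for a difference in pass rates,
    using the pooled estimate of the common pass rate."""
    p1, p2 = pass_a / n_a, pass_b / n_b
    pooled = (pass_a + pass_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p1 - p2) / se

# The same 2-point gap in pass rates (80% vs 78%) is nowhere near
# significant with 100 applicants per group, but is highly
# significant with 10,000 per group.
z_small = two_proportion_z(80, 100, 78, 100)
z_large = two_proportion_z(8000, 10000, 7800, 10000)
```

With the conventional 1.96 cut-off, z_small (about 0.35) fails to reject while z_large (about 3.5) rejects, even though the practical difference is identical.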
  • 86
    Publication Date: 2013-03-13
    Description: The focus of the article is the application of modern Bayesianism in the context of forensic science, as advocated by many in England and Europe. The article reviews the many aspects of modern Bayesianism that extend well beyond the analytic truth of Bayes’s Theorem, and focuses, among other things, on the limits of what can be accomplished by the invocation of ‘subjective’ probabilities. In a sense, all probabilities are subjective, since they are all mind dependent. However, the important issue in forensic contexts, as in others, is not the subjective nature of invoked probabilities, but the characteristics of the belief warrant to be required for them in different decisional contexts.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 87
    Publication Date: 2013-03-13
    Description: Judge Weinstein’s decision in Geressy, Rotolo and Jackson v. Digital Equipment provides a method for determining reasonable compensation regarding pain and suffering. This method relies on selecting a group of cases similar to that of the plaintiff: the ‘normative group’. We perform a statistical analysis on the cases used by Judge Weinstein to understand how he selected the normative group and how this selection relates to the quantification of pain and suffering. Classification trees find a rule that determines the normative group for each of the plaintiffs. Finally, a cumulative link model relates the variables used in the construction of the normative group to the quantification of pain and suffering.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 88
    Publication Date: 2016-09-02
    Description: Thagard’s theory of explanatory coherence (TEC) and its implementation ECHO might be considered as the de facto calculus of explanatory coherence. It is an elaborate framework to compare competing scientific theories. Recently, it has become apparent that TEC is also useful as a tool for the analysis of different scenarios in so-called sense-making systems. To this end, it is expedient to discuss a number of extensions and modifications to TEC. This article proposes a number of extensions and modifications to TEC in the context of sense-making systems. The following topics are discussed: input format, representation of false formulas, representation languages, relaxation methods, schemes of coherence, meta-explanations, scenarios, leaking hypotheses, knowledge acquisition, and contextual explanation. The discussion is detailed enough to carry through changes in existing sense-making systems.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 89
    Publication Date: 2016-09-02
    Description: A recent appeal ruling by the Supreme Court of South Australia, based in large part on the significance of the absence of an individual’s DNA on an item, has been successful in overturning a conviction. An issue of interest to the court in the original trial was the probability that two people could struggle and DNA not be detected on each other’s clothing. In the parlance of the hierarchy of propositions, the question of interest to the court was about potential activities; however, using the DNA results the expert could only provide source-level information, and in the absence of DNA results such reporting is meaningless. In this article the circumstances of the case, the evidence at trial and the expert reports are examined to understand the circumstances that led to the initial conviction and subsequent successful appeal. The main body of this article is dedicated to helping evaluate the DNA profiling results considering competing posited activities to determine the value of the exclusionary evidence to the questions posed in court. I demonstrate the application of Bayesian theory and relevant literature studies on DNA transfer to assign a likelihood ratio, which in this case supported the defence version of events.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
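The likelihood-ratio evaluation described in this abstract can be sketched in a few lines. The transfer probabilities below are purely illustrative assumptions, not the figures from the case or from the cited transfer studies:

```python
def likelihood_ratio(p_e_given_hp, p_e_given_hd):
    """Likelihood ratio for the evidence E (here: no DNA detected)
    under the prosecution proposition Hp versus the defence
    proposition Hd. LR < 1 supports the defence proposition."""
    return p_e_given_hp / p_e_given_hd

# Illustrative figures: suppose transfer studies suggested DNA would
# go undetected after a struggle only ~10% of the time, but ~90% of
# the time if no contact occurred.
lr = likelihood_ratio(p_e_given_hp=0.10, p_e_given_hd=0.90)
```

Here the absence of DNA is nine times more probable under the defence account than under the prosecution account, which is how exclusionary evidence acquires positive probative value at the activity level.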
  • 90
    Publication Date: 2016-09-02
    Description: Statistical issues arising in the certification phase of class action cases are illustrated by examining the data on promotion to the Management Trainee position in Dukes v. Wal-Mart II . After reviewing problems in the analyses submitted to the court by both parties, an alternative approach is presented. The high p -value of the analogue of the Breslow–Day homogeneity test suggests that the observed variability of the district-wide odds ratios of female to male promotions may result from random variation about a common odds ratio. The additional facts that 40 of 42 districts in California had odds ratios less than one and that the two largest odds ratios barely exceeded one (1.081 and 1.118) justify a proper aggregation of the district-level data in each region and ultimately a state-wide analysis. The estimated common odds ratio of 0.475 demonstrates that the data for all the districts in California are consistent with a system in which women had about half the odds of being promoted as men. In addition, our analysis shows that even when there is a common pattern of a female disadvantage, there will be substantial variation in the differences between the observed and expected female shares of promotions in the individual districts due to random fluctuation. Neither party examined the data for homogeneity or applied an appropriate combination test; instead, the court required that, for plaintiffs to establish a common pattern, statistical significance occur in at least one-half of the districts of a region. A power analysis, not submitted to the court, indicates that this requirement is far too stringent.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
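A common odds ratio of the kind estimated in this article can be computed with the Mantel-Haenszel estimator over the district-level 2x2 tables. The tables below are toy data for illustration, not the Dukes v. Wal-Mart figures:

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel common odds ratio across 2x2 strata.
    Each table is (a, b, c, d): a = women promoted, b = women not
    promoted, c = men promoted, d = men not promoted."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Toy district-level counts (illustrative only): in every district
# women are promoted at lower odds than men.
districts = [(5, 95, 20, 80), (3, 97, 15, 85), (8, 92, 25, 75)]
common_or = mantel_haenszel_or(districts)
```

Pooling this way weights each district by its size, which is the "proper aggregation" the abstract contrasts with the court's district-by-district significance requirement.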
  • 91
    Publication Date: 2016-12-23
    Description: Conviction in criminal trials in the USA, the UK and many other common law countries requires establishing a defendant’s guilt beyond a reasonable doubt. By contrast, in Title IX proceedings at American colleges and universities, allegations of wrongdoing are adjudicated according to a much lower ‘preponderance of the evidence’ standard. Victims’ rights advocates correctly argue that a lower burden of proof makes it easier to ensure that the guilty are punished. But there is also a mathematically inevitable corollary: a lower burden of proof increases the probability of concluding that the innocent are guilty. This article provides a framework for using information regarding false conviction probabilities in criminal trials to model the probability of false guilty verdicts in Title IX proceedings in American colleges and universities. The quantitative results presented herein show that an innocent defendant faces a dramatically increased risk of conviction when tried under the preponderance of the evidence standard as opposed to under the beyond a reasonable doubt standard.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
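One simple way to see the abstract's corollary, that a lower posterior standard raises the false-conviction probability, is a signal-detection sketch. The Gaussian evidence model, effect size and prior below are assumptions for illustration, not the article's model:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def p_false_conviction(standard, d=1.5, prior=0.5):
    """Probability an innocent defendant is convicted under posterior
    threshold `standard`, when the evidence score X ~ N(0,1) for the
    innocent and N(d,1) for the guilty (equal-variance model)."""
    # Convict iff posterior > standard, i.e. the likelihood ratio exceeds:
    lr_cut = (standard / (1 - standard)) * ((1 - prior) / prior)
    # LR(x) = exp(d*x - d*d/2), so the evidence threshold is:
    x_cut = (math.log(lr_cut) + d * d / 2) / d
    return 1 - norm_cdf(x_cut)  # chance innocent evidence exceeds the cut

risk_pre = p_false_conviction(0.50)  # preponderance of the evidence
risk_brd = p_false_conviction(0.95)  # beyond a reasonable doubt
```

Under these assumptions the innocent defendant's risk is roughly 23% at the preponderance standard versus well under 1% at the reasonable-doubt standard, the qualitative gap the article quantifies.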
  • 92
    Publication Date: 2016-12-23
    Description: Research on probabilistic reasoning has discovered several systematic errors, among which base rate neglect and the fallacy of the transposed conditional have featured prominently. This article introduces the term miss rate neglect to capture the systematic failure to properly account for false positives, i.e. the probability of evidence (E) given the hypothesis (H) is false, P(E|~H). Miss rate neglect occurs when decision makers (i) completely disregard the miss rate; (ii) underestimate the importance of differences in the miss rate, or (iii) overlook circumstances that affect the miss rate. We explain the relevance of miss rate neglect for legal decision making, review extant literature, present new experimental work that empirically validates options (ii) and (iii), and propose experimental variations that future research may pursue.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
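Bayes' theorem makes the consequence of miss rate neglect concrete: treating P(E|~H) as negligible drives the posterior toward certainty regardless of the prior. The numbers below are illustrative:

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) by Bayes' theorem; `p_e_given_not_h` is the
    'miss rate' in the article's sense, P(E|~H)."""
    joint_h = prior * p_e_given_h
    joint_not_h = (1 - prior) * p_e_given_not_h
    return joint_h / (joint_h + joint_not_h)

# With a 1% prior and a 2% miss rate, the evidence leaves H at only
# one in three; ignoring the miss rate (treating it as ~0) yields
# near-certainty from the very same evidence.
careful = posterior(prior=0.01, p_e_given_h=0.99, p_e_given_not_h=0.02)
neglect = posterior(prior=0.01, p_e_given_h=0.99, p_e_given_not_h=1e-6)
```

The gap between the two posteriors (about 0.33 versus about 0.9999) is the size of the error options (i) through (iii) in the abstract describe.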
  • 93
    Publication Date: 2016-12-23
    Description: The Federal Emergency Management Agency requires that buildings located in floodplains which undergo ‘substantial improvement’ must comply with current floodplain management standards. However, the wording of the ‘substantial improvement’ clause in the National Flood Insurance Program (44 C.F.R § 59.1) is highly ambiguous, leading to a loophole that can be easily exploited by property owners. We thus suggest a new interpretation of the law which we believe to be in the law’s intended spirit. We mathematize our new interpretation of the law and formally prove that it does not have a substantial impact on poorer property owners who are unable to make the needed improvements at one time.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 94
    Publication Date: 2016-12-23
    Description: The conjunction paradox arises when a claim requires proof of multiple elements and the likelihoods of some elements are at least partially independent of the likelihoods of others. In that situation, probability theory may dictate that the conjunction of the elements is less likely than their disjunction, implying that a defendant should not be found liable, even though each element is probably true when considered in isolation. Nonetheless, American jury instructions reject this implication, and many scholars of proof have sought to construct normative theories to justify that rejection. This article collects and critiques two families of arguments about the conjunction paradox. First, I explain why an explanatory conception of proof cannot eliminate the paradox. Second, I show why various mathematical alternatives to standard probability theory are normatively deficient when applied to legal fact-finding. Instead, I suggest that the best way to resolve the paradox is through instructions that encourage juries to make appropriate adjustments for conjunctive and disjunctive likelihoods without having to frame their analyses in mathematical terms.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
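The arithmetic behind the conjunction paradox is brief: under independence, elements that are each individually more probable than not can have a conjunction below the preponderance threshold.

```python
def conjunction(probs):
    """Probability that every element is proved, assuming the
    elements are mutually independent."""
    p = 1.0
    for x in probs:
        p *= x
    return p

# Three elements, each individually satisfying the 0.5 preponderance
# standard, yet their conjunction fails it: 0.7^3 = 0.343.
elements = [0.7, 0.7, 0.7]
joint = conjunction(elements)
```

This is the gap between element-by-element and whole-claim assessment that the article's proposed jury instructions are meant to bridge without explicit mathematics.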
  • 95
    Publication Date: 2014-03-05
    Description: Fact finders in legal trials often need to evaluate a mass of weak, contradictory and ambiguous evidence. There are two general ways to accomplish this task: by holistically forming a coherent mental representation of the case, or by atomistically assessing the probative value of each item of evidence and integrating the values according to an algorithm. Parallel constraint satisfaction models of cognitive coherence posit that a coherent mental representation is created by discounting contradicting evidence, inflating supporting evidence and interpreting ambivalent evidence in a way coherent with the emerging decision. This leads to inflated support for whichever hypothesis the fact finder accepts as true. Using a Bayesian network to model the direct dependencies between the evidence, the intermediate hypotheses and the main hypothesis, parameterised with (conditional) subjective probabilities elicited from the subjects, I demonstrate experimentally how an atomistic evaluation of evidence leads to a convergence of the computed posterior degrees of belief in the guilt of the defendant of those who convict and those who acquit. The atomistic evaluation preserves the inherent uncertainty that largely disappears in a holistic evaluation. Since the fact finders’ posterior degree of belief in the guilt of the defendant is the relevant standard of proof in many legal systems, this result implies that using an atomistic evaluation of evidence, the threshold level of posterior belief in guilt required for a conviction may often not be reached.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
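A Bayesian network of the kind used in the experiment can be sketched by enumeration for the special case where each item of evidence depends directly on the main hypothesis only. The structure and probabilities below are assumptions for illustration, not the elicited values from the study:

```python
def bn_posterior(prior_g, cpts, observations):
    """Posterior P(G=true | evidence) in a naive-Bayes-shaped network
    where every evidence node depends only on the main hypothesis G.
    cpts[name] = (P(node=true | G=true), P(node=true | G=false));
    observations[name] is the observed truth value of that node."""
    like_g, like_not_g = prior_g, 1 - prior_g
    for name, seen in observations.items():
        p_if_g, p_if_not_g = cpts[name]
        like_g *= p_if_g if seen else 1 - p_if_g
        like_not_g *= p_if_not_g if seen else 1 - p_if_not_g
    return like_g / (like_g + like_not_g)

# Hypothetical case: a witness identification observed, a fibre
# match absent. The posterior integrates both without discounting
# either, which is the atomistic evaluation the article describes.
cpts = {"witness": (0.8, 0.3), "fibre": (0.6, 0.1)}
post = bn_posterior(0.3, cpts, {"witness": True, "fibre": False})
```

Because each item enters with its elicited conditional probability, contradicting evidence pulls the posterior down rather than being reinterpreted away, preserving the uncertainty that holistic coherence-building suppresses.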
  • 96
    Publication Date: 2014-03-05
    Description: Twenty states, the District of Columbia, and the federal government have adopted sexually violent predator (SVP) Laws, which permit the post-incarceration confinement of persons who: (1) have a previous conviction or charge for a sexual offence; (2) suffer from a mental abnormality; and (3) are likely to engage in future acts of sexual aggression. Although most who are convicted of a sexual offence will not be subject to SVP commitment, a burgeoning body of research indicates that commitment is highly likely once the decision is placed in the hands of the jury. The high rate of commitment suggests that there might be a presumption of dangerousness in these proceedings, possibly stemming from the previous conviction requirement. This potential explanation was tested in the current experiment. Jury-eligible participants ( n = 190) were provided with varying degrees of information pertaining to the SVP commitment criteria. Some participants were told only that a person had been referred for an SVP commitment proceeding, whereas others were given information relevant to some or all three of the legal criteria. The rate of commitment did not vary as a function of the information provided. The mere fact that a respondent had been referred for an SVP proceeding was sufficient for a majority of participants to authorize commitment. We then calculated participants’ implicit operationalization of the ‘likely to offend’ criterion. On average, participants require the risk of recidivism to exceed 31% (range 20–40%) to effectuate commitment. These findings raise concerns about whether the constitutionally required due process occurs in SVP commitment proceedings.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 97
    Publication Date: 2014-03-05
    Description: In response to criticism of latent fingerprint evidence from a variety of authoritative extra-legal inquiries and reports, this essay describes the first iteration of a guide designed to assist with the reporting and interpretation of latent fingerprint evidence. Sensitive to the recommendations of these reports, we have endeavoured to incorporate emerging empirical evidence about the matching performance of fingerprint examiners (i.e. indicative error rates) into their testimony. We outline a way of approaching fingerprint evidence that provides a more accurate—in the sense of empirically and theoretically justified—indication of the value of fingerprint evidence than existing practice. It is an approach that could be introduced immediately. The proposal is intended to help non-experts understand the value of the evidence and improve its presentation and assessment in criminal investigations and proceedings. This first iteration accommodates existing empirical evidence and draws attention to the gap between the declaration of a match and positive identification (or individualization). Represented in this way, fingerprint evidence will be more consistent with its known value as well as the aims and conduct of the accusatorial trial.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 98
    Publication Date: 2014-03-05
    Description: This article extends existing discussion in the literature on probabilistic inference and decision making with respect to continuous hypotheses that are prevalent in forensic toxicology. As a main aim, this research investigates the properties of a widely followed approach for quantifying the level of toxic substances in blood samples, and compares this procedure with a Bayesian probabilistic approach. As an example, attention is confined to the presence of toxic substances, such as THC, in blood from car drivers. In this context, the interpretation of results from laboratory analyses needs to take into account legal requirements for establishing the ‘presence’ of target substances in blood. In a first part, the performance of the proposed Bayesian model for the estimation of an unknown parameter (here, the amount of a toxic substance) is illustrated and compared with the currently used method. The model is then used in a second part to approach, in a rational way, the decision component of the problem, that is, judicial questions of the kind ‘Is the quantity of THC measured in the blood over the legal threshold of 1.5 μg/l?’. This is illustrated through a practical example.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
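The judicial question quoted in the abstract reduces, under a Gaussian measurement model with a flat prior (an assumption for this sketch, not necessarily the article's model), to a posterior tail probability:

```python
import math

def p_over_threshold(measured, meas_sd, threshold):
    """Posterior probability that the true concentration exceeds
    `threshold`, assuming a Gaussian measurement error with standard
    deviation `meas_sd` and a flat prior, so the posterior for the
    true value is N(measured, meas_sd)."""
    z = (threshold - measured) / meas_sd
    return 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Illustrative figures: a reading of 1.6 ug/L with 0.2 ug/L
# measurement sd leaves real doubt that the 1.5 ug/L legal
# threshold is actually exceeded.
p = p_over_threshold(measured=1.6, meas_sd=0.2, threshold=1.5)
```

A point estimate of 1.6 ug/L looks "over the limit", yet the posterior probability of exceedance here is only about 0.69, which is exactly the gap between the currently used procedure and the decision-theoretic treatment the article develops.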
  • 99
    Publication Date: 2014-06-04
    Description: At a time when disciplined inference and decision making under uncertainty represent common aims to participants in legal proceedings, the scientific community is remarkably heterogeneous in its attitudes as to how these goals ought to be achieved. Probability and decision theory exert a considerable influence, and, we think, rightly so, but they go against a mainstream of thinking that does not embrace, or is not aware of, the ‘normative’ character of this body of theory. It is normative, in the sense understood in this article, in that it prescribes particular properties, typically (logical) coherence, to which reasoning and decision making ought to conform. Disregarding these properties can result in diverging views which are occasionally used as an argument against the theory, or as a pretext for not following it. Typical examples are objections according to which people, both in everyday life and at various levels in the judicial process, find the theory difficult to understand and to apply. A further objection is that the theory does not reflect how people actually behave. This article aims to point out in what sense these examples misinterpret the analytical framework in its normative perspective. Through examples borrowed mostly from forensic science contexts, it is argued that so-called intuitive scientific attitudes are particularly liable to such misconceptions. These attitudes are contrasted with a statement of the actual liberties and constraints of probability and decision theory and the view according to which this theory is normative.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 100
    Publication Date: 2013-03-13
    Description: In equal employment cases concerning fair hiring or promotion, the number of eligible candidates often exceeds the number of available positions. When a group of plaintiffs show that they were discriminated against in the selection process, one cannot determine with certainty which ones would have been chosen. Several decisions from the Seventh Circuit observed that this situation is similar to the loss of chance in tort law, where due to negligence the survival probability of a patient has been diminished. In both settings the plaintiffs’ loss can be regarded as probabilistic, i.e., in the discrimination context they lost their chance of obtaining the job or promotion. This article shows how survival analysis provides statistically sound estimates of the compensation due to a plaintiff. At each time an employment decision is made, all eligible candidates are considered. Job-related factors such as seniority or special skill can be incorporated in the estimates of the probability each candidate would be employed or promoted. These probabilities are used to weight the salary differentials to provide an estimate of the lost salary. The loss in accrued pension benefits is also weighted by the probability of being promoted before retirement. The methodology is illustrated on data from the Alexander v. Milwaukee promotion discrimination case. The survival analysis also confirmed the original finding of liability, as the chances of promotion of white males were statistically significantly lower. Because seniority was an important factor, our estimates differ from those suggested in the opinion which followed the Biondo v. City of Chicago decision. That opinion assumed a plaintiff who ultimately received a promotion would have been promoted during the period of discrimination. This assumption is questionable when seniority has a role, since an individual’s seniority increases over time.
    Print ISSN: 1470-8396
    Electronic ISSN: 1470-840X
    Topics: Mathematics , Law
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
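The probability-weighting step described in this abstract can be sketched directly. The promotion probability and salary figures below are illustrative, not from Alexander v. Milwaukee:

```python
def expected_lost_salary(p_promotion, promoted_salary, actual_salary, years):
    """Probability-weighted salary differential: compensation equals
    the estimated chance the plaintiff would have been promoted,
    times the annual pay gap, times the years affected."""
    return p_promotion * (promoted_salary - actual_salary) * years

# Illustrative figures: a 40% promotion chance and a $10,000 annual
# pay gap over 3 years gives $12,000 in probability-weighted loss.
loss = expected_lost_salary(p_promotion=0.40,
                            promoted_salary=60000,
                            actual_salary=50000,
                            years=3)
```

In the article's full method the promotion probability itself comes from a survival model incorporating seniority and skills; this sketch only shows how such a probability converts a salary differential into a compensable loss.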