- Objective Bayesian Inference for Bilateral Data Cyr Emile M’lan and Ming-Hui Chen: this paper presents three objective Bayesian methods for analyzing bilateral data under Dallal’s model and the saturated model. Three parameters are of interest, namely, the risk difference, the risk ratio, and the odds ratio. The authors derive Jeffreys’ prior and Bernardo’s reference prior associated with the three parameters that characterize Dallal’s model. They also derive the functional forms of the posterior distributions of the risk difference and the risk ratio and discuss how to sample from their posterior distributions. The authors demonstrate the use of the proposed methodology with two real data examples. They also investigate small, moderate, and large sample properties of the proposed methodology and the frequentist counterpart via simulations.
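For readers who want the three estimands spelled out, these are the standard definitions in terms of the group-specific response probabilities p_1 and p_2 (general definitions, included for orientation only; nothing here is specific to Dallal's model or the priors above):

```latex
% Standard definitions of the three parameters of interest, written in terms
% of the response probabilities p_1 and p_2 of the two groups; included for
% orientation only, not taken from the paper.
\mathrm{RD} = p_1 - p_2, \qquad
\mathrm{RR} = \frac{p_1}{p_2}, \qquad
\mathrm{OR} = \frac{p_1 (1 - p_2)}{p_2 (1 - p_1)} .
```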
- Ontology, Matter and Emergence Michel Bitbol: “Ontological emergence” of inherent high-level properties with causal powers is witnessed nowhere. A non-substantialist conception of emergence works much better. It allows downward causation, provided our concept of causality is transformed accordingly.
- Quantum mechanics in terms of realism Arthur Jabs: the author expounds an alternative to the Copenhagen interpretation of the formalism of non-relativistic quantum mechanics. The basic difference is that the new interpretation is formulated in the language of epistemological realism. It involves a change in some basic physical concepts. The ψ function is no longer interpreted as a probability amplitude of the observed behaviour of elementary particles but as an objective physical field representing the particles themselves. The particles are thus extended objects whose extension varies in time according to the variation of ψ. They are considered as fundamental regions of space with some kind of nonlocality. Special consideration is given to the Heisenberg relations, the reduction process, the problem of measurement, Schrödinger’s cat, Wigner’s friend, the Einstein-Podolsky-Rosen correlations, field quantization and quantum-statistical distributions.
- Between Laws and Models: Some Philosophical Morals of Lagrangian Mechanics J. Butterfield: the author extracts some philosophical morals from some aspects of Lagrangian mechanics. (A companion paper will present similar morals from Hamiltonian mechanics and Hamilton-Jacobi theory.) One main moral concerns methodology: Lagrangian mechanics provides a level of description of phenomena which has been largely ignored by philosophers, since it falls between their accustomed levels—“laws of nature” and “models”. Another main moral concerns ontology: the ontology of Lagrangian mechanics is both more subtle and more problematic than philosophers often realize. The treatment of Lagrangian mechanics provides an introduction to the subject for philosophers, and is technically elementary. In particular, it is confined to systems with a finite number of degrees of freedom, and for the most part eschews modern geometry.
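For orientation, the technical core that this "level of description" refers to is the Euler-Lagrange equations for a system with finitely many degrees of freedom (textbook form, included here only as background, not taken from the paper):

```latex
% Euler-Lagrange equations for generalized coordinates q_1, ..., q_n and a
% Lagrangian L(q, \dot{q}, t); standard textbook form, shown as background.
\frac{d}{dt} \frac{\partial L}{\partial \dot{q}_i} - \frac{\partial L}{\partial q_i} = 0,
\qquad i = 1, \dots, n .
```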
- Dirac Processes and Default Risk Chris Kenyon, Andrew Green: the authors introduce Dirac processes, using Dirac delta functions, for short-rate-type pricing of financial derivatives. Dirac processes add spikes to the existing building blocks of diffusions and jumps. Dirac processes are Generalized Processes, which have not been used directly before because the dollar value of non-Real numbers is meaningless. However, short-rate pricing is based on integrals so Dirac processes are natural. This integration directly implies that jumps are redundant whilst Dirac processes expand expressivity of short-rate approaches. Practically, they demonstrate that Dirac processes enable high implied volatility for CDS swaptions that has been otherwise problematic in hazard rate setups.
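A minimal sketch of why integration makes Dirac spikes usable here (the notation is mine, not the authors'): since short-rate pricing only ever sees the rate through an integral, a rate with Dirac components still yields a well-defined discount factor, which simply jumps at the spike times.

```latex
% Hedged illustration with assumed notation: a short rate made of a diffusive
% part r_d plus Dirac spikes of size a_i at times t_i.  The integral, and hence
% the discount factor exp(-\int_0^T r(s) ds), stays well defined and jumps by
% a factor e^{-a_i} at each t_i.
r(t) = r_d(t) + \sum_i a_i\, \delta(t - t_i),
\qquad
\int_0^T r(s)\, ds = \int_0^T r_d(s)\, ds + \sum_{t_i \le T} a_i .
```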
- The Knowability Paradox in the light of a Logic for Pragmatics Massimiliano Carrara and Daniele Chiffi: The Knowability Paradox is a logical argument showing that if all truths are knowable in principle, then all truths are, in fact, known. Many strategies have been suggested in order to avoid the paradoxical conclusion. A family of solutions – called logical revision – has been proposed to solve the paradox by revising the underlying logic, the intuitionistic revision being one example. In this paper, the authors focus on such revisionary solutions to the paradox – solutions that put the blame on the underlying logic. Specifically, they analyse a possible translation of the paradox into a modified intuitionistic fragment of a logic for pragmatics (KILP) inspired by Dalla Pozza and Garola in 1995. Their aim is to understand whether KILP is a candidate for the logical revision of the paradox and to compare it with the standard intuitionistic solution to the paradox.
- A Proposed Probabilistic Extension of the Halpern and Pearl Definition of ‘Actual Cause' Luke Fenton-Glynn: In their article ‘Causes and Explanations: A Structural-Model Approach. Part I: Causes’, Joseph Halpern and Judea Pearl draw upon structural equation models to develop an attractive analysis of ‘actual cause’. Their analysis is designed for the case of deterministic causation. It is shown here that their account can be naturally extended to provide an elegant treatment of probabilistic causation.
- Faster Statistical Model Checking for Unbounded Temporal Properties Przemysław Daca, Thomas A. Henzinger, Jan Křetínský, Tatjana Petrov: The authors present a new algorithm for the statistical model checking of Markov chains with respect to unbounded temporal properties, such as reachability and full linear temporal logic. The main idea is that they monitor each simulation run on the fly, in order to detect quickly if a bottom strongly connected component is entered with high probability, in which case the simulation run can be terminated early. As a result, the authors' simulation runs are often much shorter than required by termination bounds that are computed a priori for a desired level of confidence and size of the state space. In comparison to previous algorithms for statistical model checking, for a given level of confidence, the authors' method is not only faster in many cases but also requires less information about the system, namely, only the minimum transition probability that occurs in the Markov chain. In addition, the method can be generalised to unbounded quantitative properties such as mean-payoff bounds.
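To make the early-termination idea concrete, here is a toy Python monitor for a single simulation run (my simplification, not the authors' algorithm: their stopping threshold is derived from the minimum transition probability and the desired confidence, whereas here it is just a free parameter).

```python
import random

def run_with_early_termination(transitions, start, target, stay_threshold=200,
                               max_steps=100_000, rng=random):
    """Toy monitor for one simulation run of a finite Markov chain.

    transitions: dict state -> list of (next_state, probability) pairs
    target: set of states for a reachability property
    stay_threshold: consecutive steps without seeing a new state after which
        we declare the run probably trapped in a bottom SCC.  In the paper
        this bound is computed from the minimum transition probability and
        the confidence level; here it is simply a parameter.

    Returns (property_satisfied, steps_used) for this run.
    """
    def step(state):
        r, acc = rng.random(), 0.0
        for nxt, p in transitions[state]:
            acc += p
            if r <= acc:
                return nxt
        return transitions[state][-1][0]  # guard against rounding

    visited = {start}
    state, stay = start, 0
    for t in range(max_steps):
        if state in target:
            return True, t            # reachability witnessed, stop now
        state = step(state)
        if state in visited:
            stay += 1
            if stay >= stay_threshold:
                return False, t       # probably circling in a bottom SCC
        else:
            visited.add(state)
            stay = 0                  # still exploring new states
    return False, max_steps

# Example: a chain where state 2 is absorbing and the target is {1}.
chain = {0: [(1, 0.1), (2, 0.9)], 1: [(1, 1.0)], 2: [(2, 1.0)]}
print(run_with_early_termination(chain, 0, {1}))
```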
- Conditions for positioning of nucleosomes on DNA Michael Sheinman, Ho-Ryun Chung: (excellent application of physics to biology) - Positioning of nucleosomes along eukaryotic genomes plays an important role in their organization and regulation. There are many different factors affecting the location of nucleosomes. Some can be viewed as preferential binding of a single nucleosome to different locations along the DNA and some as interactions between neighboring nucleosomes. In this study the authors analyze how well nucleosomes are positioned along the DNA as a function of the strength of the preferential binding, the correlation length of the binding energy landscape, the interactions between neighboring nucleosomes and other relevant system properties. They consider different scenarios: designed energy landscapes and generically disordered ones, and derive conditions for good positioning. Using analytic and numerical approaches they find that, even if the binding preferences are very weak, a synergistic interplay between the interactions and the binding preferences is essential for good positioning of nucleosomes, especially on correlated energy landscapes. Analyzing an empirical energy landscape, they discuss the relevance of their theoretical results to the positioning of nucleosomes on DNA in vivo.
- The Delicacy of Counterfactuals in General Relativity Erik Curiel: General relativity poses serious problems for counterfactual propositions peculiar to it as a physical theory, problems that have gone unremarked in both the physics and the philosophy literature. Because these problems arise from the dynamical nature of spacetime geometry, they are shared by all schools of thought on how counterfactuals should be interpreted and understood. Given the role of counterfactuals in the characterization of, inter alia, many accounts of scientific laws, theory-confirmation and causation, general relativity once again presents us with idiosyncratic puzzles that any attempt to analyze and understand the nature of scientific knowledge and of science itself must face.
- Information, learning and falsification David Balduzzi: Broadly speaking, there are two approaches to quantifying information. The first, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out. The second, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it. Shannon information provides the mathematical foundation for communication and coding theory. Algorithmic information has been applied by Solomonoff and Hutter to prove remarkable results on universal induction. However, both approaches have shortcomings. Algorithmic information is not computable, severely limiting its practical usefulness. Shannon information refers to ensembles rather than actual events: it makes no sense to compute the Shannon information of a single string – or rather, there are many answers to this question depending on how a related ensemble is constructed. Although there are asymptotic results linking algorithmic and Shannon information, it is unsatisfying that there is such a large gap – a difference in kind – between the two measures. This note describes a new method of quantifying information, effective information, that links algorithmic information to Shannon information, and also links both to capacities arising in statistical learning theory. After introducing the measure, the author shows that it provides a non-universal analog of algorithmic information. It is then applied to derive basic capacities in statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. A nice byproduct of this approach is an interpretation of the explanatory power of a learning algorithm in terms of the number of hypotheses it falsifies (counted in two different ways for the two different capacities). The author also discusses how effective information relates to information gain, Shannon information and mutual information, and concludes by discussing some broader implications.
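Since the entry invokes empirical Rademacher complexity, here is a minimal Monte Carlo sketch of that capacity for a finite hypothesis class (standard definition; it is not the paper's effective-information construction, and the names are mine):

```python
import numpy as np

def empirical_rademacher(predictions, n_draws=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity.

    predictions: array of shape (num_hypotheses, n) holding h(x_i) in {-1, +1}
        for each hypothesis h evaluated on a fixed sample x_1, ..., x_n.
    Returns an estimate of E_sigma[ max_h (1/n) sum_i sigma_i h(x_i) ].
    """
    rng = np.random.default_rng(seed)
    num_h, n = predictions.shape
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher signs
        total += np.max(predictions @ sigma) / n  # best correlation with the noise
    return total / n_draws

# Example: 3 hypotheses evaluated on 5 sample points.
preds = np.array([[ 1,  1, -1,  1, -1],
                  [-1,  1,  1,  1,  1],
                  [ 1, -1, -1, -1,  1]], dtype=float)
print(empirical_rademacher(preds))
```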
- Weighing Explanations Daniel Star, Stephen Kearns: The primary goal of John Broome’s new book, Rationality Through Reasoning (2013), is to outline and defend an account of reasoning that makes it clear how it is possible to actively become more rational by means of engaging in reasoning. In the process Broome finds it necessary to also provide his own accounts of ought, reasons, and requirements. The authors focus here on the account of reasons. This is not the first time they have done so. In an earlier paper (Kearns and Star 2008), they contrasted Broome’s account with their own favored account of reasons (reasons as evidence). Although there are some differences between the views defended in the relevant chapters of Broome’s book (chs. 3 and 4) and the draft manuscript and earlier papers that the authors used as the basis of their discussion in that earlier paper, these do not, for the most part, substantially affect the authors' earlier arguments. The authors note that in articulating an alternative account of reasons they were heavily influenced by Broome, so they are particularly grateful to have this opportunity to contribute a piece to a 'Festschrift' for him. In a response to the authors and some other critics, Broome (2008) presented some challenges for their account of reasons, but did not address their criticisms of his own account (and they responded, in turn, to Broome’s challenges in Kearns and Star 2013). Here they first provide updated versions of their earlier concerns, since these mostly still seem pertinent. They then provide a fresh response to his account of reasons that focuses on the notion of a weighing explanation. On Broome’s account, 'pro tanto' reasons are facts cited in weighing explanations of what one ought to do; facts that have weights. It is not clear what the idea that pro tanto reasons have weights really amounts to. While recognizing that a simple analogy with putative non-normative weighing explanations involving physical weights initially seems helpful, the authors argue that the notion of a weighing explanation, especially a normative weighing explanation, does not ultimately stand up to scrutiny.
- Ascribing Consciousness to Artificial Intelligence Murray Shanahan: This paper critically assesses the anti-functionalist stance on consciousness adopted by certain advocates of integrated information theory (IIT), a corollary of which is that human-level artificial intelligence implemented on conventional computing hardware is necessarily not conscious. The critique draws on variations of a well-known gradual neuronal replacement thought experiment, as well as bringing out tensions in IIT’s treatment of self-knowledge. The aim, though, is neither to reject IIT outright nor to champion functionalism in particular. Rather, it is suggested that both ideas have something to offer a scientific understanding of consciousness, as long as they are not dressed up as solutions to illusory metaphysical problems. As for human-level AI, we must await its development before we can decide whether or not to ascribe consciousness to it.
- Why Physics Needs Philosophy Tim Maudlin: "Philosophy cannot be killed by any scientific or logical reasoning: just think about that": Many questions about the nature of reality cannot be properly pursued without contemporary physics. Inquiry into the fundamental structure of space, time and matter must take account of the theory of relativity and quantum theory. Philosophers accept this. In fact, several leading philosophers of physics hold doctorates in physics. Yet they chose to affiliate with philosophy departments rather than physics departments because so many physicists strongly discourage questions about the nature of reality. The reigning attitude in physics has been “shut up and calculate”: solve the equations, and do not ask questions about what they mean. But putting computation ahead of conceptual clarity can lead to confusion. Take, for example, relativity’s iconic “twin paradox.” Identical twins separate from each other and later reunite. When they meet again, one twin is biologically older than the other. (Astronaut twins Scott and Mark Kelly are about to realize this experiment: when Scott returns from a year in orbit in 2016 he will be about 28 microseconds younger than Mark, who is staying on Earth.) No competent physicist would make an error in computing the magnitude of this effect. But even the great Richard Feynman did not always get the explanation right. In “The Feynman Lectures on Physics,” he attributes the difference in ages to the acceleration one twin experiences: the twin who accelerates ends up younger. But it is easy to describe cases where the opposite is true, and even cases where neither twin accelerates but they end up different ages. The calculation can be right and the accompanying explanation wrong. If your goal is only to calculate, this might be sufficient. But understanding existing theories and formulating new ones requires more. Einstein arrived at the theory of relativity by reflecting on conceptual problems rather than on empirical ones. He was primarily bothered by explanatory asymmetries in classical electromagnetic theory. Physicists before Einstein knew, for instance, that moving a magnet in or near a coil of wire would induce an electric current in the coil. But the classical explanation for this effect appeared to be entirely different when the motion was ascribed to the magnet as opposed to the coil; the reality is that the effect depends only on the relative motion of the two. Resolving the explanatory asymmetry required rethinking the notion of simultaneity and rejecting the classical account of space and time. It required the theory of relativity. Comprehending quantum theory is an even deeper challenge. What does quantum theory imply about “the nature of reality?” Scientists do not agree about the answer; they even disagree about whether it is a sensible question. The problems surrounding quantum theory are not mathematical. They stem instead from the unacceptable terminology that appears in presentations of the theory. Physical theories ought to be stated in precise terminology, free of ambiguity and vagueness. What philosophy offers to science, then, is not mystical ideas but meticulous method. Philosophical skepticism focuses attention on the conceptual weak points in theories and in arguments. It encourages exploration of alternative explanations and new theoretical approaches. Philosophers obsess over subtle ambiguities of language and over what follows from what. 
When the foundations of a discipline are secure this may be counter-productive: just get on with the job to be done! But where secure foundations (or new foundations) are needed, critical scrutiny can suggest the way forward. The search for ways to marry quantum theory with general relativity would surely benefit from precisely articulated accounts of the foundational concepts of these theories, even if only to suggest what must be altered or abandoned. Philosophical skepticism arises from the theory of knowledge, the branch of philosophy called “epistemology.” Epistemology studies the grounds for our beliefs and the sources of our concepts. It often reveals tacit presuppositions that may prove wrong, sources of doubt about how much we really know.
- Towards A Mathematical Theory Of Complex Socio-Economical Systems By Functional Subsystems Representation Giulia Ajmone Marsan, Nicola Bellomo, Massimo Egidi (LUISS Guido Carli): This paper deals with the development of a mathematical theory for complex socio-economical systems. The approach is based on the methods of the mathematical kinetic theory for active particles, which describes the evolution of large systems of interacting entities that are carriers of specific functions, in this case economic activities. The method is implemented with the concept of functional subsystems, constituted by aggregated entities which have the ability to express socio-economical purposes and functions.
- The generalised quasispecies Raphael Cerf and Joseba Dalmau: a really excellent study in population dynamics and macro-evolution - the authors study Eigen’s quasispecies model in the asymptotic regime where the length of the genotypes goes to ∞ and the mutation probability goes to 0. They give several explicit formulas for the stationary solutions of the limiting system of differential equations.
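For context, one standard way to write Eigen's quasispecies dynamics (the paper studies an asymptotic limit of a system of this kind; the notation here is the usual textbook one, not necessarily the authors'):

```latex
% One standard form of Eigen's quasispecies equations: x_i is the relative
% frequency of genotype i, f_j the fitness of genotype j, and Q_{ij} the
% probability that replication of genotype j produces genotype i.
\dot{x}_i = \sum_j Q_{ij} f_j x_j - x_i \sum_j f_j x_j,
\qquad \sum_i x_i = 1 .
```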
- Evolutionary Prediction Games Jeffrey A. Barrett, Michael Dickson, Gordon Purves: the authors consider an extension of signaling games to the case of prediction, where one agent (‘sender’) perceives the current state of the world and sends a signal. The second agent (‘receiver’) perceives this signal, and makes a prediction about the next state of the world (which evolves according to stochastic but not entirely random ‘laws’). They then suggest that such games may be the basis of a model for the evolution of successful theorizing about the world.
- Sigma-Point Filtering Based Parameter Estimation in Nonlinear Dynamic System Juho Kokkala, Arno Solin, and Simo Särkkä: the authors consider approximate maximum likelihood parameter estimation in non-linear state-space models. They discuss both direct optimization of the likelihood and expectation–maximization (EM). For EM, they also give closed-form expressions for the maximization step in a class of models that are linear in parameters and have additive noise. To obtain approximations to the filtering and smoothing distributions needed in the likelihood-maximization methods, the authors focus on using Gaussian filtering and smoothing algorithms that employ sigma-points to approximate the required integrals. They discuss different sigma point schemes based on the third, fifth, seventh, and ninth order unscented transforms and Gauss–Hermite quadrature rule. They compare the performance of the methods in two simulated experiments: a univariate toy model as well as tracking of a maneuvering target. In the experiments, the authors also compare against approximate likelihood estimates obtained by particle filtering and extended Kalman filtering based methods. The experiments suggest that the higher-order unscented transforms may in some cases provide more accurate estimates.
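As a reminder of what the sigma points are doing, here is a minimal sketch of the classic third-order unscented transform for pushing a Gaussian through a nonlinearity (standard textbook construction, not the authors' code; the parameter defaults are common choices, not theirs).

```python
import numpy as np

def unscented_transform(f, mean, cov, alpha=1.0, beta=0.0, kappa=None):
    """Propagate a Gaussian N(mean, cov) through a nonlinear function f using
    the classic (third-order) unscented transform.  Textbook construction,
    shown only to illustrate what sigma points are; the paper also considers
    higher-order unscented schemes and Gauss-Hermite quadrature.
    """
    n = mean.size
    if kappa is None:
        kappa = 3.0 - n                      # a common default choice
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * cov)  # matrix square root of (n+lam)*cov

    # 2n + 1 sigma points: the mean plus symmetric spreads along the columns of L.
    sigma_pts = np.vstack([mean, mean + L.T, mean - L.T])

    # Weights for the transformed mean and covariance.
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)

    ys = np.array([f(x) for x in sigma_pts])  # push each sigma point through f
    y_mean = wm @ ys
    diffs = ys - y_mean
    y_cov = (wc[:, None] * diffs).T @ diffs
    return y_mean, y_cov

# Example: a 2-D Gaussian through a mildly nonlinear map.
m = np.array([1.0, 0.5])
P = np.array([[0.10, 0.02], [0.02, 0.20]])
print(unscented_transform(lambda x: np.array([np.sin(x[0]), x[0] * x[1]]), m, P))
```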
- Is Howard’s Separability Principle a sufficient condition for Outcome Independence? Paul Boes: Howard [1985, 1989, 1992] has argued that the experimentally confirmed violation of the Bell inequalities forces us to reject at least one of two physical principles, which he terms the locality and separability principles. To this end, he provides a proof [Howard, 1992] of the equivalence of the separability condition, a formal condition to which the separability principle gives rise, with the condition of “outcome independence”. If this proof is sound, then Howard’s claim would gain strong support, in that “outcome independence” and “parameter independence”, where the latter arises from Howard’s locality principle, have been shown by [Jarrett, 1984] to conjunctively constitute a necessary condition for the derivation of the Bell inequalities [Clauser and Horne, 1974]. However, Howard’s proof has been contested in a number of ways. In this essay the author discusses several criticisms of Howard’s equivalence proof that focus on the sufficiency of the separability principle for outcome independence. Paul then argues that, while none of these criticisms succeeds, they do constrain the possible form of Howard’s argument. To do so, he first introduces both the separability principle and outcome independence in the context of EPR-like experiments before discussing the individual arguments.
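For reference, the two conditions at issue, in the usual hidden-variable notation (λ the complete state, a and b the measurement settings, A and B the outcomes); this is the standard Jarrett/Shimony formulation rather than anything specific to Boes's essay:

```latex
% Parameter independence (PI) and outcome independence (OI) in the usual
% hidden-variable notation; their conjunction yields the factorizability
% condition used in deriving Bell-type inequalities.
\text{PI:}\quad p(A \mid a, b, \lambda) = p(A \mid a, \lambda),
\qquad
\text{OI:}\quad p(A \mid a, b, \lambda, B) = p(A \mid a, b, \lambda),
\\[4pt]
\text{PI and OI together give}\quad
p(A, B \mid a, b, \lambda) = p(A \mid a, \lambda)\, p(B \mid b, \lambda) .
```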
- Reflected Backward Stochastic Differential Equations When The Obstacle Is Not Right-Continuous And Optimal Stopping Miryana Grigorova, Peter Imkeller, Elias Offen, Youssef Ouknine, Marie-Claire Quenez: In the first part of the paper, the authors study reflected backward stochastic differential equations (RBSDEs) with a lower obstacle which is assumed to be right upper-semicontinuous but not necessarily right-continuous. They prove existence and uniqueness of the solutions to such RBSDEs in appropriate Banach spaces. The result is established by using some tools from the general theory of processes, such as the Mertens decomposition of optional strong (but not necessarily right-continuous) supermartingales, some tools from optimal stopping theory, as well as an appropriate generalization of Itô’s formula due to Gal’chouk and Lenglart. In the second part of the paper, the authors provide some links between the RBSDE studied in the first part and an optimal stopping problem in which the risk of a financial position ξ is assessed by an f-conditional expectation E^f (where f is a Lipschitz driver). They characterize the "value function" of the problem in terms of the solution to their RBSDE. Under an additional assumption of left upper-semicontinuity on ξ, they show the existence of an optimal stopping time. They also provide a generalization of the Mertens decomposition to the case of strong E^f-supermartingales.
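As background, a lower-obstacle reflected BSDE in the familiar right-continuous setting looks schematically as follows (notation assumed, shown only for orientation; the paper's point is precisely to relax the right-continuity of the obstacle ξ):

```latex
% Schematic lower-obstacle RBSDE with driver f, obstacle process (\xi_t),
% Brownian motion B, and a nondecreasing reflecting process A satisfying a
% Skorokhod-type minimality condition; standard right-continuous setting.
Y_t = \xi_T + \int_t^T f(s, Y_s, Z_s)\, ds + A_T - A_t - \int_t^T Z_s\, dB_s,
\qquad Y_t \ge \xi_t \ \ \text{for all } t,
\qquad \int_0^T (Y_{s-} - \xi_{s-})\, dA_s = 0 .
```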
- Generalized Support and Formal Development of Constraint Propagators (On Artificial Intelligence) James Caldwell, Ian P. Gent, Peter Nightingale: The concept of support is pervasive in constraint programming. Traditionally, when a domain value ceases to have support, it may be removed because it takes part in no solutions. Arc-consistency algorithms such as AC2001 make use of support in the form of a single domain value. GAC algorithms such as GAC-Schema use a tuple of values to support each literal. The authors generalize these notions of support in two ways. First, they allow a set of tuples to act as support. Second, the supported object is generalized from a set of literals (GAC-Schema) to an entire constraint or any part of it. They also design a methodology for developing correct propagators using generalized support. A constraint is expressed as a family of support properties, which may be proven correct against the formal semantics of the constraint. Using the Curry-Howard isomorphism to interpret constructive proofs as programs, they show how to derive correct propagators from the constructive proofs of the support properties. The framework is carefully designed to allow efficient algorithms to be produced. Derived algorithms may make use of dynamic literal triggers or watched literals for efficiency. Finally, two case studies of deriving efficient algorithms are given.
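A toy illustration of the basic notion of support the entry builds on: plain GAC-style pruning over an extensionally given constraint (the generalized supports and formally derived propagators of the paper are far more sophisticated; all names here are mine).

```python
def gac_revise(domains, scope, allowed_tuples):
    """Prune domain values that have no support in an extensional constraint.

    domains: dict variable -> set of values
    scope: tuple of variables the constraint ranges over
    allowed_tuples: set of value-tuples (aligned with scope) the constraint allows

    A value v for variable x is supported if some allowed tuple assigns v to x
    and assigns every other variable in the scope a value still in its domain.
    Returns True if any value was removed.
    """
    changed = False
    for i, x in enumerate(scope):
        supported = set()
        for t in allowed_tuples:
            if t[i] in domains[x] and all(t[j] in domains[y]
                                          for j, y in enumerate(scope) if j != i):
                supported.add(t[i])          # t is a support for (x, t[i])
        if supported != domains[x]:
            domains[x] = supported           # drop values with no support
            changed = True
    return changed

# Example: x + y = z over small domains, given extensionally.
doms = {"x": {1, 2, 3}, "y": {1, 2}, "z": {2, 3}}
tuples_xyz = {(a, b, a + b) for a in range(1, 4) for b in range(1, 3) if a + b <= 3}
gac_revise(doms, ("x", "y", "z"), tuples_xyz)
print(doms)   # x loses the value 3; y and z are unchanged
```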
- An Algorithm Set Revolutionizes 3-D Protein Structure Discovery A new way to determine 3-D structures from 2-D images is set to speed up protein structure discovery by a factor of 100,000. Via 'Emerging Technology From the arXiv'/MIT Technology Review.
- Finite relation algebras and omitting types in modal fragments of first order logic Tarek Sayed Ahmed: Let 2 < n ≤ l < m < ω. Let L_n denote first order logic restricted to the first n variables. It is shown that the omitting types theorem fails dramatically for the n-variable fragments of first order logic with respect to clique guarded semantics, and for its packed n-variable fragments. Both are modal fragments of L_n. As a sample, the author shows that if there exists a finite relation algebra with a so-called strong l-blur and no m-dimensional relational basis, then there exists a countable, atomic and complete L_n theory T and a type Γ, such that Γ is realizable in every so-called m-square model of T, but any witness isolating Γ cannot use fewer than l variables. An m-square model M of T gives a form of clique guarded semantics, where the parameter m measures how locally well behaved M is. Every ordinary model is k-square for any n < k < ω, but the converse is not true. Any model M is ω-square, and the two notions are equivalent if M is countable. Such relation algebras are shown to exist for certain values of l and m, such as n ≤ l < ω with m = ω, and l = n with m ≥ n + 3. The case l = n and m = ω gives that the omitting types theorem fails for L_n with respect to (usual) Tarskian semantics: there is an atomic countable L_n theory T for which the single non-principal type consisting of co-atoms cannot be omitted in any model M of T. For n < ω, positive results on omitting types are obtained for L_n by imposing extra conditions on the theories and/or the types omitted. Positive and negative results on omitting types are obtained for infinitary variants and extensions of L_{ω,ω}.
- Can Quantum Analogies Help Us to Understand the Process of Thought? Paavo Pylkkänen: A number of researchers today make an appeal to quantum physics when trying to develop a satisfactory account of the mind, an appeal still felt to be controversial by many. Often these “quantum approaches” try to explain some well-known features of conscious experience (or mental processes more generally), thus using quantum physics to enrich the explanatory framework or explanans used in consciousness studies and cognitive science. This paper considers the less studied question of whether quantum physical intuitions could help us to draw attention to new or neglected aspects of the mind in introspection, and in this way change our view about what needs explanation in the first place. Although prima facie implausible, it is suggested that this could happen, for example, if there were analogies between quantum processes and mental processes (e.g., the process of thinking). The naive idea is that such analogies would help us to see mental processes and conscious experience in a new way. It has indeed been proposed long ago that such analogies exist, and this paper first focuses at some length on David Bohm’s formulation of them from 1951. It then briefly considers these analogies in relation to Smolensky’s more recent analogies between cognitive science and physics, and Pylkkö’s aconceptual view of the mind. Finally, Bohm’s early analogies will be briefly considered in relation to the analogies between quantum processes and the mind he proposed in his later work.
- Researchers are demonstrating that, in certain contexts, namely AdS Spaces - AdS/CFT Correspondence 'duality', string theory is the only consistent theory of quantum gravity: Might this make it true? By Natalie Wolchover, via Quanta Magazine: Thirty years have passed since a pair of physicists, working together on a stormy summer night in Aspen, Colo., realized that string theory might have what it takes to be the “theory of everything.” “We must be getting pretty close,” Michael Green recalls telling John Schwarz as the thunder raged and they hammered away at a proof of the theory’s internal consistency, “because the gods are trying to prevent us from completing this calculation.” Their mathematics that night suggested that all phenomena in nature, including the seemingly irreconcilable forces of gravity and quantum mechanics, could arise from the harmonics of tiny, vibrating loops of energy, or “strings.” The work touched off a string theory revolution and spawned a generation of specialists who believed they were banging down the door of the ultimate theory of nature. But today, there’s still no answer. Because the strings that are said to quiver at the core of elementary particles are too small to detect — probably ever — the theory cannot be experimentally confirmed. Nor can it be disproven: Almost any observed feature of the universe jibes with the strings’ endless repertoire of tunes. The publication of Green and Schwarz’s paper “was 30 years ago this month,” the string theorist and popular-science author Brian Greene wrote in Smithsonian Magazine in January, “making the moment ripe for taking stock: Is string theory revealing reality’s deep laws? Or, as some detractors have claimed, is it a mathematical mirage that has sidetracked a generation of physicists?” Greene had no answer, expressing doubt that string theory will “confront data” in his lifetime. Recently, however, some string theorists have started developing a new tactic that gives them hope of someday answering these questions. Lacking traditional tests, they are seeking validation of string theory by a different route. Using a strange mathematical dictionary that translates between laws of gravity and those of quantum mechanics, the researchers have identified properties called “consistency conditions” that they say any theory combining quantum mechanics and gravity must meet. And in certain highly simplified imaginary worlds, they claim to have found evidence that the only consistent theories of “quantum gravity” involve strings.
- A Conundrum in Bayesian Epistemology of Disagreement by Tomoji Shogenji (credit for posting here goes to Andrew Teasdale): The proportional weight view in epistemology of disagreement generalizes the equal weight view and proposes that we assign to the judgments of different people weights that are proportional to their epistemic qualifications. It is known that (under the plausible Context-Free Assumption) if the resulting aggregate degrees of confidence are to constitute a probability function, they must be the weighted arithmetic means of individual degrees of confidence, but aggregation by the weighted arithmetic means violates the Bayesian rule of conditionalization. The double bind entails that the proportional weight view is inconsistent with Bayesianism. The paper explores various ways to respond to this challenge to the proportional weight view.
- The Fine-Tuning Argument - Klaas Landsman: are the laws of nature and our cosmos delicately fine-tuned for life to emerge, as appears to be the case?
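To make Shogenji's "double bind" concrete, here is a small worked example (the numbers are mine, purely for illustration) showing that equal-weight linear pooling does not commute with Bayesian conditionalization on A:

```latex
% Worked example (numbers assumed for illustration): equal-weight linear
% pooling of two credence functions does not commute with conditionalization.
\text{Agent 1: } P_1(A \wedge B) = 0.1,\; P_1(A \wedge \neg B) = 0.1,\; P_1(\neg A) = 0.8,
\qquad
\text{Agent 2: } P_2(A \wedge B) = 0.4,\; P_2(A \wedge \neg B) = 0.2,\; P_2(\neg A) = 0.4.
\\[4pt]
\text{Pool, then condition on } A:\quad
\frac{\tfrac{1}{2}(0.1 + 0.4)}{\tfrac{1}{2}(0.2 + 0.6)} = \frac{0.25}{0.4} = 0.625,
\qquad
\text{Condition, then pool:}\quad
\tfrac{1}{2}\!\left(\frac{0.1}{0.2} + \frac{0.4}{0.6}\right) \approx 0.583 .
```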
- First Quantum Music Composition Unveiled Physicists have mapped out how to create quantum music, an experience that will be profoundly different for every member of the audience, they say. Via MIT Technology Review.
- Definitional Argument in Evolutionary Psychology and Cultural Anthropology John P. Jackson, Jr: The role of disciplinary history in the creation and maintenance of disciplinary autonomy and authority has been a target of scholarly inquiry at least since Thomas Kuhn’s (1970) claim that such histories were key indicators of a reigning paradigm. In the United States, the history of psychology is a recognized subdiscipline of psychology and histories of psychology serve to inculcate students into psychology as well as to establish and maintain the authority of research programs (Ash 1983; Leahey 1992; Samelson 1997; Samelson 2000). We should not be surprised, therefore, to find evolutionary psychologists appealing to the history of the social sciences when they argue for the necessity and value of their nascent discipline. In this paper the author examines how evolutionary psychologists use the history of science in order to create space for their new discipline. In particular, he is interested in how they employ a particular account of the origins of American cultural anthropology at the beginning of the twentieth century. Evolutionary psychologists offer a particular history of cultural anthropology as an argument for why we now need evolutionary psychology. John will show that each discipline (EP and anthropology) attempted to create space for itself by defining a central term, “culture.” In defining “culture” each discipline also defined its scientific program: defining the nature of scientific inquiry by defining the central object of study. These definitional moves are not necessarily explicit in the argument, however; rather than arguments about definition, these scientists are offering an argument by definition. An argument by definition should not be taken to be an argument about (or from) a definition. In some sense, an argument by definition does not appear to be an argument at all: The key definitional move is simply stipulated, as if it were a natural step along the way of justifying some other claim…. One cannot help noticing an irony here. Definition of terms is a key step in the presentation of argument, and yet this critical step is taken by making moves that are not themselves argumentative at all. They are not claims supported by reasons and intended to justify adherence by critical listeners. Instead they are simply proclaimed as if they were indisputable facts.
- Large Margin Nearest Neighbor Embedding for Knowledge Representation - Miao Fan et al: on artificial intelligence and knowledge representation; the authors propose a large-margin nearest-neighbor embedding together with a learning algorithm that efficiently finds the optimal solution via stochastic gradient descent in an iterative fashion.
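As a generic illustration of the kind of training loop the entry gestures at, here is a sketch of stochastic gradient descent on a margin ranking loss for knowledge-graph style embeddings (a translation-style score is assumed purely for concreteness; this is not the authors' model, and all names and hyperparameters are mine).

```python
import numpy as np

# Generic sketch (not the authors' model): SGD on a margin ranking loss for
# entity/relation embeddings, where a true triple (h, r, t) should score
# better than a corrupted one (h, r, t_neg) by at least a fixed margin.
rng = np.random.default_rng(0)
n_entities, n_relations, dim, margin, lr = 50, 5, 16, 1.0, 0.01
E = rng.normal(scale=0.1, size=(n_entities, dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(n_relations, dim))  # relation embeddings

def score(h, r, t):
    # Lower is better: distance-style score ||E[h] + R[r] - E[t]||^2.
    d = E[h] + R[r] - E[t]
    return d @ d

def sgd_step(h, r, t, t_neg):
    # Hinge loss: max(0, margin + score(true) - score(corrupted)).
    if margin + score(h, r, t) - score(h, r, t_neg) <= 0:
        return                              # margin already satisfied
    d_pos = E[h] + R[r] - E[t]
    d_neg = E[h] + R[r] - E[t_neg]
    grad_h = 2 * (d_pos - d_neg)            # gradient w.r.t. E[h] (and R[r])
    E[h] -= lr * grad_h
    R[r] -= lr * grad_h
    E[t] -= lr * (-2 * d_pos)
    E[t_neg] -= lr * (2 * d_neg)

# One toy epoch over random triples with random corruptions of the tail.
for _ in range(1000):
    h, t = rng.integers(n_entities, size=2)
    r = rng.integers(n_relations)
    sgd_step(h, r, t, rng.integers(n_entities))
```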
- A Categorial Semantic Representation of Quantum Event Structures E. Zafiris, V. Karakostas: The overwhelming majority of attempts to explore the problems related to quantum logical structures and their interpretation have been based on an underlying set-theoretic syntactic language: could a shift to a category-theoretic 'mode' do better at explaining the global structure of a quantum algebra of events (or propositions) in terms of sheaves of local Boolean frames?