- Explanation in Biology: An Enquiry into the Diversity of Explanatory Patterns in the Life Sciences Pierre-Alain Braillard and Christophe Malaterre: Despite the philosophical clash between deductive-nomological and mechanistic accounts of explanation, in scientific practice both approaches are required in order to achieve more complete explanations and guide the discovery process. Here, this thesis is defended by discussing the case of mathematical models in systems biology. Not only do such models complement the mechanistic explanations of molecular biology by accounting for poorly understood aspects of biological phenomena, but they can also reveal unsuspected ‘black boxes’ in mechanistic explanations, thus prompting their revision while providing new insights about the causal-mechanistic structure of the world.
- Realism and instrumentalism about the wave function. How should we choose? Mauro Dorato: The main claim of the paper is that one can be a ‘realist’ (in some sense) about quantum mechanics without requiring any form of realism about the wave function. The author begins by discussing various forms of realism about the wave function, namely Albert’s configuration-space realism, Dürr, Zanghì, and Goldstein’s nomological realism about Ψ, Esfeld’s dispositional reading of Ψ, and Pusey, Barrett, and Rudolph’s realism about the quantum state. By discussing the articulation of these four positions and their interrelation, he concludes that instrumentalism about Ψ is by itself not sufficient to choose one of these interpretations of quantum mechanics over the others, thereby confirming in a different way the underdetermination of the metaphysical interpretations of quantum mechanics.
- On the sufficiency of pairwise interactions in maximum entropy models of biological networks Lina Merchan, Ilya Nemenman: Biological information processing networks consist of many components, which are coupled by an even larger number of complex multivariate interactions. However, analyses of data sets from fields as diverse as neuroscience, molecular biology, and behavior have reported that observed statistics of states of some biological networks can be approximated well by maximum entropy models with only pairwise interactions among the components. Based on simulations of random Ising spin networks with p-spin (p > 2) interactions, the authors argue that this reduction in complexity can be thought of as a natural property of densely interacting networks in certain regimes, and not necessarily as a special property of living systems. By connecting their analysis to the theory of random constraint satisfaction problems, they suggest a reason for why some biological systems may operate in this regime.
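To make the object of study concrete, here is a minimal sketch of a pairwise maximum-entropy (Ising) model of the kind the abstract refers to. The fields and couplings below are drawn at random purely for illustration; in practice they would be fit so that the model reproduces the observed first- and second-order statistics of the data.

```python
# Minimal sketch of a pairwise maximum-entropy (Ising) model over n binary units.
# Illustrative only: fields h and couplings J would normally be fit to data
# (e.g. by gradient ascent on the likelihood), not drawn at random as here.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 5
h = rng.normal(size=n)            # local fields (first-order constraints)
J = rng.normal(size=(n, n)) / n   # pairwise couplings (second-order constraints)
J = np.triu(J, k=1)               # keep upper triangle so each pair counts once

def energy(s):
    """Ising energy: -sum_i h_i s_i - sum_{i<j} J_ij s_i s_j."""
    return -h @ s - s @ J @ s

# Exact Boltzmann distribution by enumeration (feasible only for small n).
states = np.array(list(itertools.product([-1, 1], repeat=n)))
logw = np.array([-energy(s) for s in states])
p = np.exp(logw - logw.max())
p /= p.sum()
print("most probable state:", states[p.argmax()], "p =", p.max())
```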
- A Novel Plasticity Rule Can Explain the Development of Sensorimotor Intelligence Ralf Der and Georg Martius: Grounding autonomous behavior in the nervous system is a fundamental challenge for neuroscience. In particular, self-organized behavioral development provides more questions than answers. Are there special functional units for curiosity, motivation, and creativity? This paper argues that these features can be grounded in synaptic plasticity itself, without requiring any higher-level constructs. The authors propose differential extrinsic plasticity (DEP) as a new synaptic rule for self-learning systems and apply it to a number of complex robotic systems as a test case. Without specifying any purpose or goal, seemingly purposeful and adaptive behavior develops, displaying a certain level of sensorimotor intelligence. These surprising results require no system-specific modifications of the DEP rule but arise rather from the underlying mechanism of spontaneous symmetry breaking due to the tight brain-body-environment coupling. The new synaptic rule is biologically plausible and would be an interesting target for neurobiological investigation. The authors also argue that this neuronal mechanism may have been a catalyst in natural evolution.
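For readers unfamiliar with derivative-driven plasticity, the sketch below conveys the general flavor of a differential-Hebbian update, in which weights grow from correlated sensor and motor velocities. This is not the authors' DEP rule (DEP additionally routes sensor derivatives through an inverse model); all names, parameters, and dynamics here are illustrative assumptions.

```python
# A generic differential-Hebbian update, sketched to convey the flavor of
# derivative-driven plasticity rules such as DEP. NOT the authors' exact rule:
# DEP additionally passes sensor derivatives through an inverse model.
import numpy as np

def differential_hebbian_step(W, x_prev, x_now, y_prev, y_now,
                              dt=0.01, eta=0.1, tau=1.0):
    """Update weights from co-occurring sensor and motor *derivatives*.

    W   : (n_motor, n_sensor) synaptic weight matrix
    x_* : sensor readings at consecutive time steps
    y_* : motor outputs at consecutive time steps
    """
    dx = (x_now - x_prev) / dt             # sensor velocity
    dy = (y_now - y_prev) / dt             # motor velocity
    dW = eta * np.outer(dy, dx) - W / tau  # Hebb on derivatives + slow decay
    return W + dt * dW

# Closing the loop: motors are driven by current sensors through W, so
# spontaneous fluctuations can be amplified into coordinated behavior.
def motor_output(W, x):
    return np.tanh(W @ x)
```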
- From survivors to replicators: Evolution by natural selection revisited Pierrick Bourrat: For evolution by natural selection to occur, it is classically held that the three ingredients of variation, difference in fitness, and heredity are necessary and sufficient. In this paper, the author shows, using simple individual-based models, that evolution by natural selection can occur in populations of entities in which neither heredity nor reproduction is present. Furthermore, he demonstrates, by extending these models, that both reproduction and heredity are predictable Darwinian products (i.e. complex adaptations) of populations initially lacking these two properties but in which new variation is introduced via mutations. He then shows that replicators are not necessary for evolution by natural selection but are rather the ultimate product of such processes of adaptation. Finally, he assesses the value of these models in three domains relevant to Darwinian evolution.
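A minimal illustration of the core idea (not Bourrat's actual models): selection by differential persistence alone. No entity reproduces or passes anything on, yet the composition of the surviving population shifts toward the more viable variants.

```python
# Minimal illustration (not the paper's actual model): natural selection as
# differential persistence. Entities vary in viability; none reproduce or
# inherit anything, yet the surviving population's composition evolves
# toward the more viable types.
import numpy as np

rng = np.random.default_rng(1)
viability = rng.uniform(0.90, 0.999, size=10_000)  # per-step survival prob.
alive = np.ones_like(viability, dtype=bool)

for t in range(200):
    alive &= rng.random(viability.shape) < viability  # survive this step or not

print("mean viability, initial  :", viability.mean().round(4))
print("mean viability, survivors:", viability[alive].mean().round(4))
```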
- Game theory elucidates the collective behavior of bosons: Quantum particles behave in strange ways and are often difficult to study experimentally. Using mathematical methods drawn from game theory, LMU physicists have shown how bosons, which like to enter the same state, can form multiple groups. When scientists explore the mysterious behavior of quantum particles, they soon reach the limits of present-day experimental research; from there on, progress is only possible with the aid of theoretical ideas. NIM investigator Prof. Erwin Frey and his team at the Department of Statistical and Biological Physics (LMU Munich) have followed this route to study the behavior of bosons, quantum particles that like to cluster together. By applying methods from the mathematical field of game theory, the Munich physicists were able to explain why, and under what conditions, bosons form multiple groups.
- Correlation of action potentials in adjacent neurons M. N. Shneider and M. Pekker: A possible mechanism for the synchronization of action potential propagation along a bundle of neurons (ephaptic coupling) is considered. It is shown that this mechanism is similar to the saltatory conduction of the action potential between the nodes of Ranvier in myelinated axons. The proposed model allows the authors to estimate the scale of the correlation, i.e., the distance between neurons in the nervous tissue at which their synchronization becomes possible. The possibility of experimental verification of the proposed model of synchronization is discussed.
- Explaining the Unobserved—Why Quantum Mechanics Ain’t Only About Information Amit Hagar and Meir Hemmo: A remarkable theorem by Clifton, Bub and Halvorson (2003) (CBH) characterizes quantum theory in terms of information-theoretic principles. According to Bub (2004, 2005), the philosophical significance of the theorem is that quantum theory should be regarded as a “principle” theory about (quantum) information rather than a “constructive” theory about the dynamics of quantum systems. Here the authors criticize Bub’s principle approach, arguing that if the mathematical formalism of quantum mechanics remains intact, then there is no escape route from solving the measurement problem by constructive theories. They further propose a (Wigner-type) thought experiment which, they argue, demonstrates that quantum mechanics on the information-theoretic approach is incomplete.
- Metareasoning for Planning Under Uncertainty Christopher H. Lin, Andrey Kolobov, Ece Kamar, and Eric Horvitz: The conventional model for online planning under uncertainty assumes that an agent can stop and plan without incurring costs for the time spent planning. However, planning time is not free in most real-world settings. For example, an autonomous drone is subject to nature’s forces, like gravity, even while it thinks, and must either pay a price for counteracting these forces to stay in place, or grapple with the state change caused by acquiescing to them. Policy optimization in these settings requires metareasoning, a process that trades off the cost of planning against the potential policy improvement that planning can achieve. The authors formalize and analyze the metareasoning problem for Markov Decision Processes (MDPs). Their work subsumes previously studied special cases of metareasoning and shows that in the general case, metareasoning is at most polynomially harder than solving MDPs with any given algorithm that disregards the cost of thinking. For reasons the authors discuss, optimal general metareasoning turns out to be impractical, motivating approximations. They present approximate metareasoning procedures that rely on special properties of the BRTDP planning algorithm and explore the effectiveness of their methods on a variety of problems.
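As a toy illustration of the trade-off (not the paper's formalism, and unrelated to its BRTDP-based procedures), a myopic metareasoner might keep refining a value-iteration plan only while the improvement bought by one more sweep exceeds the cost of the time that sweep consumes:

```python
# Toy myopic metareasoning sketch: stop value iteration once one more sweep
# buys less improvement than the thinking time it costs. Illustrative only.
import numpy as np

def plan_with_thinking_cost(P, R, gamma=0.95, think_cost=1e-3, max_sweeps=1000):
    """P: (A, S, S) transition tensor, R: (S,) reward; returns value estimate V."""
    n_actions, n_states, _ = P.shape
    V = np.zeros(n_states)
    for sweep in range(max_sweeps):
        Q = R + gamma * (P @ V)          # (A, S): one backup for every action
        V_new = Q.max(axis=0)            # greedy over actions
        improvement = np.abs(V_new - V).max()
        V = V_new
        if improvement < think_cost:     # further thinking not worth its price
            break
        # a non-myopic metareasoner would also weigh *future* improvements
    return V, sweep + 1
```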
- Can the brain map 'non-conventional' geometries (and abstract spaces)? Grid cells, the space-mapping neurons of the entorhinal cortex of rodents, could also work for hyperbolic surfaces. A SISSA study just published in Interface, the journal of the Royal Society, tests a model (a computer simulation) based on mathematical principles that explains how maps emerge in the brain and shows how these maps adapt to the environment in which the individual develops. "It took human culture millennia to arrive at a mathematical formulation of non-Euclidean spaces," comments SISSA neuroscientist Alessandro Treves, "but it's very likely that our brains could get there long before. In fact, it's likely that the brain of rodents gets there very naturally every day."
- Reconstructing Liberalism: Charles Mills' Unfinished Project Jack Turner: The political theory of Charles W. Mills seeks simultaneously to expose liberalism's complicity with white supremacy and to transform liberalism into a source of antiracist political critique. This article analyzes both Mills' critique of liberalism and his attempt to reconstruct it into a political philosophy capable of adequately addressing racial injustice. The author focuses on Mills' (a) problematization of moral personhood, (b) theorization of white ignorance, and (c) conceptualization of white supremacy. Together these establish the need to integrate a new empirical axiom into liberal political theory: the axiom of the power of white supremacy in modernity, or the axiom of white power, for short. This axiom is analogous to James Madison's axiom of the encroaching spirit of power. Any liberal theorist who fails to take the encroaching spirit of power seriously meets the scorn of his peers; the same should be true for white power, Mills suggests. Turner concludes that Mills' reconstruction of liberalism is incomplete and urges him to develop a fully-fledged liberal theory of racial justice to complete it.
- Deep Boltzmann Machines with Fine Scalability Taichi Kiwaki: This study presents a layered Boltzmann machine (BM) whose generative performance scales with the number of layers. Application of deep BMs (DBMs) has been limited by their poor scalability: deep stacking of layers does not substantially improve performance. It is widely believed that DBMs have huge representation power and that their poor empirical scalability is mainly due to inefficient optimization algorithms. In this paper, the author theoretically shows that the representation power of DBMs is actually rather limited, and that this limitation of the model can account for the poor scalability. Based on these observations, an alternative BM architecture is proposed, dubbed soft-deep BMs (sDBMs). It is theoretically shown that sDBMs possess much greater representation power than DBMs. Experiments demonstrate that sDBMs with up to 6 layers can be trained without pretraining, and that sDBMs compare favorably with state-of-the-art models on binarized MNIST and Caltech-101 Silhouettes.
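For reference, the standard energy function of a two-hidden-layer DBM, in textbook form with bias terms omitted (the paper's sDBM architecture modifies this, in ways not reproduced here):

```latex
% Standard two-hidden-layer DBM energy and Boltzmann distribution
% (textbook form, biases omitted; not the sDBM variant):
E(\mathbf{v}, \mathbf{h}^{1}, \mathbf{h}^{2})
  = -\mathbf{v}^{\top} W^{1} \mathbf{h}^{1}
    - (\mathbf{h}^{1})^{\top} W^{2} \mathbf{h}^{2},
\qquad
p(\mathbf{v}, \mathbf{h}^{1}, \mathbf{h}^{2})
  = \frac{1}{Z}\exp\!\bigl(-E(\mathbf{v}, \mathbf{h}^{1}, \mathbf{h}^{2})\bigr).
```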
- Weihrauch-completeness for layerwise computability Arno Pauly, George Davie: the authors introduce the notion of being Weihrauch-complete for layerwise computability and provide several natural examples related to complex oscillations, the law of the iterated logarithm, and Birkhoff’s theorem. They also consider hitting time operators, which share the Weihrauch degree of the former examples but fail to be layerwise computable.
- Understanding Gauge James Owen Weatherall: the author considers two usages of the expression “gauge theory”. On one, a gauge theory is a theory with excess structure; on the other, a gauge theory is any theory appropriately related to classical electromagnetism. He makes precise one sense in which one formulation of electromagnetism, the paradigmatic gauge theory on both usages, may be understood to have excess structure, and then argues that gauge theories on the second usage, including Yang-Mills theory and general relativity, do not generally have excess structure in this sense.
- On the Threshold of Intractability Pål Grønås Drange, Markus Sortland Dregi, and Daniel Lokshtanov: the authors study the computational complexity of the graph modification problems Threshold Editing and Chain Editing: adding and deleting as few edges as possible to transform the input into a threshold (or chain) graph. In this article, they show that both problems are NP-hard, resolving a conjecture by Natanzon, Shamir, and Sharan. On the positive side, they show that the problem admits a quadratic vertex kernel. Furthermore, they give a subexponential-time parameterized algorithm solving Threshold Editing in 2^{O(√k log k)} + poly(n) time, making it one of relatively few natural problems in this complexity class on general graphs. These results are of broader interest to the field of social network analysis, where recent work of Brandes (ISAAC, 2014) posits that the minimum edit distance to a threshold graph gives a good measure of consistency for node centralities. Finally, the authors show that all their positive results extend to the related problem of Chain Editing, as well as to the completion and deletion variants of both problems.
- How can we be moral when we are so irrational? Nils-Eric Sahlin and Johan Brännmark: Normative ethics usually presupposes background accounts of human agency, and although different ethical theorists might have different pictures of human agency in mind, there is still something like a standard account that most of mainstream normative ethics can be understood to rest on. Ethical theorists tend to have Rational Man, or at least some close relative of his, in mind when constructing normative theories. It is argued here that empirical findings raise doubts about the accuracy of this kind of account; human beings fall too far short of ideals of rationality for it to be meaningful to devise normative ideals within such a framework. Instead, it is suggested, normative ethics could be conducted more profitably if the idea of unifying all ethical concerns into one theoretical account is abandoned. This disunity of ethical theorizing would then match the disunited and heuristic-oriented nature of our agency.
- Relax, Tensors Are Here: Dependencies in International Processes Shahryar Minhas, Peter D. Hoff, Michael D. Ward: Previous models of international conflict have suffered two shortfalls. They tended not to embody dynamic changes, focusing rather on static slices of behavior over time. These models have also been empirically evaluated in ways that assumed the independence of each country, when in reality the interest lies in the interdependence among all countries. Here, the authors illustrate a solution to these two hurdles and evaluate this new, dynamic, network-based approach to the dependencies among the ebb and flow of daily international interactions, using a newly developed, and openly available, database of events among nations.
- No Big Bang? Quantum equation predicts universe has no beginning: The universe may have existed forever, according to a new model that applies quantum correction terms to complement Einstein's theory of general relativity. The model may also account for dark matter and dark energy, resolving multiple problems at once.
- Dynamical and Hamiltonian formulation of General Relativity Domenico Giulini: This is a substantially expanded version of a chapter-contribution to The Springer Handbook of Spacetime, edited by Abhay Ashtekar and Vesselin Petkov, published by Springer Verlag in 2014. It introduces the reader to the reformulation of Einstein’s field equations of General Relativity as a constrained evolutionary system of Hamiltonian type and discusses some of its uses, together with some technical and conceptual aspects. Attempts were made to keep the presentation self-contained and accessible to first-year graduate students. This implies a certain degree of explicitness and occasional reviews of background material.
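Schematically, and in standard textbook notation rather than anything specific to this chapter, the reformulation expresses the Hamiltonian as a sum of constraints on a spatial slice Σ, smeared with the lapse N and shift N^a:

```latex
% Schematic ADM form (standard textbook notation): the gravitational
% Hamiltonian is a sum of constraints, each vanishing weakly on solutions.
H[N, N^{a}] = \int_{\Sigma} \mathrm{d}^{3}x\,
  \bigl( N\,\mathcal{H}_{\perp} + N^{a}\,\mathcal{H}_{a} \bigr),
\qquad
\mathcal{H}_{\perp} \approx 0, \quad \mathcal{H}_{a} \approx 0 .
```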
- Leibniz’s Theory of Time Soshichi Uchii: the author has developed an informational interpretation of Leibniz’s metaphysics and dynamics, but in this paper he concentrates on Leibniz’s theory of time. According to Uchii’s interpretation, each monad is an incorporeal automaton programmed by God, and likewise each organized group of monads is a cellular automaton (in von Neumann’s sense) governed by a single dominant monad (entelechy). The activities of these produce phenomena, which must be “coded appearances” of these activities; God determines this coding. A crucially important point here is that we have to distinguish the phenomena for a monad from its states (perceptions). Both are a kind of representation: a state represents the whole world of monads, and phenomena for a monad “result” from the activities of monads. But the coding for each must be different: R(W) for the first, Ph(W) for the second, where W is a state of the monadic world. The reason for this is that no monadic state is in space and time, but phenomena occur in space and time. Now, the basis of phenomenal time must lie in the timeless realm of monads. This basis is the order of state-transitions of each monad. All the changes of these states are given at once by God, and they do not presuppose time. The coded appearances of this order (which may well be different for different creatures) occur in time for any finite creature, and the metric of this time must depend on God’s coding for phenomena. For humans, in particular, this metric time is derived from spatial distance (metric space) via the laws of dynamics. Thus there may well be an interrelation between the spatial and temporal metrics. This means that the Leibnizian frame allows a relativistic metric of space-time. Uchii shows this after outlining Leibniz’s scenario.
- Estimation of connectivity measures in gappy time series G. Papadopoulos, D. Kugiumtzis: A new method is proposed to compute connectivity measures on multivariate time series with gaps. Rather than removing or filling the gaps, the rows of the joint data matrix containing empty entries are removed and the calculations are done on the remainder matrix. The method, called measure adapted gap removal (MAGR), can be applied to any connectivity measure that uses a joint data matrix, such as cross correlation, cross mutual information, and transfer entropy. Using these three measures, MAGR is compared favorably to a number of known gap-filling techniques, as well as to gap closure. The superiority of MAGR is illustrated on time series from synthetic systems and on financial time series.
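The row-removal idea is simple enough to sketch in a few lines. The function below (our naming, not the authors' code) applies it to the lagged cross-correlation of a bivariate series whose gaps are marked with NaNs.

```python
# Minimal sketch of the row-removal idea behind MAGR (function name is ours):
# drop every time index at which any needed channel is missing, then compute
# the connectivity measure on the remaining joint samples.
import numpy as np

def magr_cross_correlation(X, lag=1):
    """X: (T, 2) bivariate series with NaNs marking gaps.

    Builds the lagged joint matrix [x_t, y_{t+lag}], removes rows containing
    NaNs, and returns the Pearson correlation of the remainder.
    """
    x, y = X[:-lag, 0], X[lag:, 1]
    keep = ~(np.isnan(x) | np.isnan(y))   # remove gappy rows, don't fill them
    return np.corrcoef(x[keep], y[keep])[0, 1]

# Example: a noisy coupled pair with ~10% of samples missing at random.
rng = np.random.default_rng(2)
T = 2000
x = rng.normal(size=T)
y = np.roll(x, 1) + 0.5 * rng.normal(size=T)   # y lags x by one step
for Z in (x, y):
    Z[rng.random(T) < 0.1] = np.nan            # punch random gaps
print(magr_cross_correlation(np.column_stack([x, y]), lag=1))
```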
- The Quantum Fabric of Space-Time: How Quantum Pairs Stitch Space-Time - What do quantum entanglement and gravity have to do with each other? New tools may reveal how quantum information builds the structure of space. Brian Swingle, via Jennifer Ouellette at Quanta Magazine.
- On statistical indistinguishability of complete and incomplete discrete time market models Nikolai Dokuchaev: the author investigates the possibility of statistical evaluation of market completeness for discrete time stock market models. It is known that market completeness is not a robust property: small random deviations of the coefficients convert a complete market model into an incomplete one. The paper shows that market incompleteness is also non-robust. It is also shown that, for any incomplete market from a wide class of discrete time models, there exists a complete market model with arbitrarily close stock prices. This means that incomplete markets are indistinguishable from complete markets in terms of market statistics.
- On the Structure, Covering, and Learning of Poisson Multinomial Distributions Constantinos Daskalakis, Gautam Kamath, and Christos Tzamos: An (n, k)-Poisson Multinomial Distribution (PMD) is the distribution of the sum of n independent random vectors supported on the set B_k = {e_1, ..., e_k} of standard basis vectors in R^k. The authors prove a structural characterization of these distributions, showing that, for all ε > 0, any (n, k)-Poisson multinomial random vector is ε-close, in total variation distance, to the sum of a discretized multidimensional Gaussian and an independent (poly(k/ε), k)-Poisson multinomial random vector. Their structural characterization extends the multidimensional CLT of [VV11] by simultaneously applying to all approximation requirements ε. In particular, it removes factors depending on log n and, importantly, on the minimum eigenvalue of the PMD’s covariance matrix from the distance to a multidimensional Gaussian random variable. They use this structural characterization to obtain an ε-cover, in total variation distance, of the set of all (n, k)-PMDs, significantly improving the cover size of [DP08, DP15] and obtaining the same qualitative dependence of the cover size on n and ε as the k = 2 cover of [DP09, DP14]. The authors then further exploit this structure to show that (n, k)-PMDs can be learned to within ε in total variation distance from Õ_k(1/ε²) samples, which is near-optimal in terms of the dependence on ε and independent of n. In particular, their result generalizes the single-dimensional result of [DDS12] for Poisson Binomials to arbitrary dimension.
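To make the object concrete, the following few lines sample from an (n, k)-PMD as a sum of n independent one-hot vectors; this is purely illustrative and has nothing to do with the paper's covering or learning algorithms.

```python
# An (n, k)-PMD made concrete (illustrative; not the paper's algorithms):
# the sum of n independent one-hot vectors, each with its own distribution
# over the k standard basis vectors e_1, ..., e_k.
import numpy as np

rng = np.random.default_rng(3)
n, k = 100, 4
P = rng.dirichlet(np.ones(k), size=n)   # row i: distribution of the i-th vector

def sample_pmd(P):
    """Draw one (n, k)-PMD sample: counts of how often each basis vector occurred."""
    choices = np.array([rng.choice(len(p), p=p) for p in P])
    return np.bincount(choices, minlength=P.shape[1])

print(sample_pmd(P))   # a length-k vector of counts summing to n
```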
- Fair is Fair: Social Preferences and Reciprocity in International Politics Behavioral economics has shown that many people often deviate from classical assumptions about self-interested behavior: they have social preferences, caring about fairness and reciprocity. Social psychologists show that these preferences vary across actors, with some displaying more prosocial value orientations than others. Integrating a laboratory bargaining experiment with original archival research on Anglo-French and Franco-German diplomacy in the interwar period, it is shown how fairness and reciprocity matter in social interactions. Prosocials do not exploit their bargaining leverage to the degree that proselfs do, which helps explain why some pairs of actors are better able to avoid bargaining failure than others. In the face of consistent egoism on the part of negotiating partners, however, prosocials engage in negative reciprocity, leading them to adopt the same behaviors as proselfs.
- High dimensional linear inverse modelling Fenwick C. Cooper: the author introduces and demonstrates two linear inverse modelling methods for systems of stochastic ODEs whose accuracy is independent of the dimensionality (number of elements) of the state vector representing the system in question. Truncation of the state space is not required. Instead, he relies on the principle that perturbations decay with distance, or on the fact that for many systems the state at each data point is determined at any instant only by itself and its neighbours. He further shows that all necessary calculations, as well as numerical integration of the resulting linear stochastic system, require computational time and memory proportional to the dimensionality of the state vector.
- On Time in Quantum Physics Jeremy Butterfield: First, the author briefly reviews the different conceptions of time held by three rival interpretations of quantum theory: the collapse of the wave-packet, the pilot-wave interpretation, and the Everett interpretation (Section 2). Then he turns to a much less controversial task: to expound the recent understanding of the time-energy uncertainty principle, and indeed of uncertainty principles in general, that has been established by such authors as Busch, Hilgevoord and Uffink. Although this may at first seem a narrow topic, Butterfield points out connections to other conceptual topics about time in quantum theory: for example, the question under what circumstances there is a time operator.
- Cooperative Intergroup Mating Can Overcome Ethnocentrism in Diverse Populations Caitlin J. Mouri, Thomas R. Shultz: Ethnocentrism is a behavioral strategy seen on every scale of social interaction. Game-theory models demonstrate that evolution selects ethnocentrism because it boosts cooperation, which increases reproductive fitness. However, some believe that inter-ethnic unions have the potential to foster universal cooperation and overcome in-group biases in humans. Here, the authors use agent-based computer simulations to test this hypothesis. Cooperative intergroup mating does lend an advantage to a universal cooperation strategy when the cost/benefit ratio of cooperation is low and local population diversity is high.
- Researchers Discover How lncRNA Silences Entire Chromosome: Scientists at Caltech say they have discovered how long non-coding RNAs (lncRNAs) can regulate critical genes. By studying an lncRNA called Xist, the researchers identified how this RNA gathers a group of proteins and ultimately prevents female embryos from having an extra functional X chromosome, a condition that leads to death early in development. These findings, note the scientists, mark the first time that researchers have uncovered the detailed mechanism of action for lncRNA genes.