34,00 € *

plus shipping, if applicable

High Quality Content by WIKIPEDIA articles! In computational complexity theory, a branch of computer science, Schaefer's dichotomy theorem states necessary and sufficient conditions under which a finite set S of Boolean relations yields a polynomial-time or an NP-complete problem when the relations of S are used to constrain some of the propositional variables. More precisely, Schaefer defines a decision problem that he calls the Generalized Satisfiability problem for S, denoted SAT(S). The problem is to determine whether a given formula is satisfiable, in other words, whether the variables can be assigned values that satisfy all the constraints. Special cases of SAT(S) include variants of the Boolean satisfiability problem, and SAT(S) can also be viewed as a constraint satisfaction problem over the Boolean domain.
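The definition of SAT(S) can be made concrete with a small brute-force checker (an illustrative sketch, not from the listing; the relation and function names are made up): a relation is a set of allowed Boolean tuples, and a constraint applies one relation to a tuple of variable indices.

```python
# Hypothetical brute-force checker for the Generalized Satisfiability
# problem SAT(S): try every assignment and test all constraints.
from itertools import product

def sat_s(num_vars, constraints):
    """constraints: list of (relation, var_indices) pairs."""
    for assignment in product([0, 1], repeat=num_vars):
        if all(tuple(assignment[i] for i in idx) in rel
               for rel, idx in constraints):
            return True
    return False

# Example relation: "one-in-three" (exactly one of three variables is true),
# the basis of the NP-complete 1-in-3-SAT case of Schaefer's theorem.
ONE_IN_THREE = {(1, 0, 0), (0, 1, 0), (0, 0, 1)}
print(sat_s(3, [(ONE_IN_THREE, (0, 1, 2))]))  # True
```

This exponential enumeration is only meant to pin down the problem statement; Schaefer's theorem is about when SAT(S) avoids such brute force.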

Seller: Dodax

As of: 17.02.2020. View offer

36,64 € *

plus shipping, if applicable

This thesis deals with a general modeling framework for large-scale biological systems which is, on the one hand, applied to various practical instances and, on the other hand, strictly formalized and mathematically analyzed with respect to its complexity and structure. For the biological application, an overview of existing analytic methods for biological systems is first presented, and the proposed modeling framework is classified in this context. The framework is based on logical implication formulas. It allows for the verification of a biological model, the prediction of its response to prescribed stimuli, and the identification of possible intervention strategies for diseases or failure modes. This basic model is then extended in two directions. First, timing information about the reactions in the biological unit is incorporated. This generalization additionally makes it possible to detect unknown timing information or inconsistencies that arise from modeling errors. Besides this, it provides a method for consistently integrating the logical models of related biological units into one model. Second, the purely binary basic framework is enhanced with a fine discretization of a biological component's activity level. This makes it possible to express different effects depending on the level of activity of one component, so the predictions of the model become more refined. On the mathematical side, the logical framework and its extensions are derived and formalized. The basic model leads to a special type of satisfiability problem (SAT) whose complexity is shown to be hard in general, although mathematically easy subclasses are identified. The correspondence between SAT and integer programming is exploited, and the underlying polyhedra are analyzed. Interestingly, the SAT problem admits a wider class of polynomially solvable problems than its integer programming equivalent.
Nevertheless, computational results show that the integer programming approach is computationally feasible. The basic SAT problem can also be translated into a bipartite digraph, for which algorithms are adapted and their practical use is discussed. Furthermore, for a special class of biological units, a duality framework based on linear programming duality is derived, which completes the theory of such biological units. The dynamic extension of the basic framework yields a related SAT problem that contains the original one as a special case and is thus hard to solve as well. For this extension, the focus is on the analysis of maximally feasible and minimally infeasible solutions of the extended SAT problem. This requires optimizing over the set of solutions of the SAT problem, which suggests employing the equivalent integer programming approach. To enumerate all maximally feasible and minimally infeasible solutions, the Joint Generation algorithm is used. To this end, a monotone reformulation of the extended SAT problem is derived that preserves the maximally feasible and minimally infeasible solutions while significantly reducing the size of the description. In certain very restrictive cases, the resulting integer optimization problems are even computationally tractable. Finally, the minimally infeasible solutions are completely characterized in terms of graph structures in the original digraph, and an alternative method for computing all minimally infeasible solutions via polyhedral projection is obtained. The discrete extension of the logical framework leads to a generalization of the SAT problem, the so-called interval satisfiability problem, in which the variables are integer-valued and associated intervals provide the sets of values for which an expression becomes TRUE.
To compute feasible solutions, this problem is transformed into a system of polynomials whose feasibility can be checked by means of Hilbert's Nullstellensatz. Moreover, the general interval satisfiability problem is analyzed with respect to complexity and satisfiability. Concerning computational complexity, it is shown to be hard in general, even under certain restrictions on the formulas. Concerning satisfiability behavior, the well-known threshold phenomenon of classical random SAT, which has also been observed for interval satisfiability, is examined, and lower bounds on specific thresholds are identified.
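The SAT / integer-programming correspondence exploited in the thesis can be illustrated with a toy encoding (a hedged sketch, not the thesis's actual formulation; all names are illustrative): each clause becomes a linear inequality over 0/1 variables, so a formula is satisfiable exactly when the 0/1 system is feasible.

```python
# Each clause (a disjunction of literals) maps to the inequality
# "sum of its literal values >= 1" over binary variables; feasibility
# of the whole system is checked here by brute-force enumeration.
from itertools import product

def clause_to_inequality(clause, assignment):
    # clause: list of (var_index, is_positive); satisfied iff sum >= 1
    return sum(assignment[v] if pos else 1 - assignment[v]
               for v, pos in clause) >= 1

def ip_feasible(num_vars, clauses):
    return any(all(clause_to_inequality(c, a) for c in clauses)
               for a in product([0, 1], repeat=num_vars))

# (x0 or x1) and (not x0 or x1) and (not x1): infeasible
clauses = [[(0, True), (1, True)], [(0, False), (1, True)], [(1, False)]]
print(ip_feasible(2, clauses))  # False
```

A real integer-programming approach would hand these inequalities to an IP solver instead of enumerating assignments; the point is only that the two feasibility questions coincide.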

Seller: Dodax

As of: 17.02.2020. View offer

34,00 € *

plus shipping, if applicable

High Quality Content by WIKIPEDIA articles! The Valiant–Vazirani theorem is an important result in computational complexity theory. It was proven by Leslie Valiant and Vijay Vazirani in their paper titled "NP is as easy as detecting unique solutions", published in 1986. The theorem states that if there is a polynomial-time algorithm for UNIQUE-SAT, then NP = RP. The theorem implies that even if the number of satisfying assignments is very small, SAT (which is an NP-complete problem) remains a hard problem. UNIQUE-SAT is a promise problem that asks whether a given Boolean formula is unsatisfiable or has exactly one satisfying assignment. In the first case a UNIQUE-SAT algorithm should reject the formula, and in the second it should accept it. If the formula has more than one satisfying assignment, the behavior of the UNIQUE-SAT algorithm does not matter.
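The UNIQUE-SAT promise can be made concrete with a brute-force model counter (an illustrative sketch only; exponential, and the function names are made up):

```python
# Count satisfying assignments of a CNF formula by enumeration, then decide
# UNIQUE-SAT under the promise that the count is 0 or 1.
from itertools import product

def count_models(num_vars, clauses):
    # clauses: lists of literals; positive int v means x_v, negative means not x_v
    return sum(all(any(a[abs(l) - 1] == (l > 0) for l in c) for c in clauses)
               for a in product([False, True], repeat=num_vars))

def unique_sat(num_vars, clauses):
    # Meaningful only under the promise: 0 models -> reject, 1 -> accept;
    # with more than one model, any answer is allowed.
    return count_models(num_vars, clauses) == 1

# x1 and (x1 or x2) and (not x2) has exactly one model: x1=True, x2=False
print(unique_sat(2, [[1], [1, 2], [-2]]))  # True
```

The theorem's content is that even a polynomial-time procedure working only under this promise would collapse NP to RP, which this exhaustive check does not provide.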

Seller: Dodax

As of: 17.02.2020. View offer

34,00 € *

plus shipping, if applicable

High Quality Content by WIKIPEDIA articles! The language TQBF is a formal language in computer science that contains true quantified Boolean formulas. A fully quantified Boolean formula is a formula in quantified propositional logic in which every variable is quantified (or bound), using either an existential or a universal quantifier, at the beginning of the sentence. Any such formula is always either true or false (since there are no free variables). If such a formula evaluates to true, then it is in the language TQBF, which is also known as QSAT (Quantified SAT). In computational complexity theory, the quantified Boolean formula problem (QBF) is a generalization of the Boolean satisfiability problem in which both existential and universal quantifiers can be applied to each variable.
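A minimal recursive evaluator shows why QBF generalizes SAT (a sketch with made-up names): an existential quantifier becomes an "or" over the two values of its variable, a universal quantifier becomes an "and".

```python
# Evaluate a fully quantified Boolean formula given as a quantifier prefix
# plus a matrix (a Python predicate over a variable assignment).
def eval_qbf(prefix, matrix, assignment=None):
    """prefix: list of ('E' or 'A', var); matrix: dict -> bool."""
    assignment = dict(assignment or {})
    if not prefix:
        return matrix(assignment)
    (q, var), rest = prefix[0], prefix[1:]
    branches = (eval_qbf(rest, matrix, {**assignment, var: b})
                for b in (False, True))
    return any(branches) if q == 'E' else all(branches)

# forall x exists y: (x or y) and (not x or not y)  -- true (take y = not x)
formula = lambda a: (a['x'] or a['y']) and (not a['x'] or not a['y'])
print(eval_qbf([('A', 'x'), ('E', 'y')], formula))  # True
```

With an all-existential prefix this is exactly SAT; alternating quantifiers are what push the problem from NP-complete up to PSPACE-complete.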

Seller: Dodax

As of: 17.02.2020. View offer

101,64 € *

plus shipping, if applicable

Although NP-hard problems are believed to be unsolvable in general, tractability results suggest that some practical instances can be solved efficiently. Combinatorial search algorithms are designed to explore the usually large solution space of such instances efficiently by reducing the search space to feasible regions and using heuristics to explore those regions. Various mathematical formalisms may be used to express and tackle combinatorial problems, among them the constraint satisfaction problem (CSP) and the propositional satisfiability problem (SAT). These algorithms, or constraint solvers, apply search-space reduction through inference techniques, use activity-based heuristics to guide exploration, diversify the search through frequent restarts, and often learn from their mistakes. In this book the author focuses on knowledge sharing in combinatorial search: the capacity to generate and exploit meaningful information, such as redundant constraints, heuristic hints, and performance measures, during search, which can dramatically improve the performance of a constraint solver. Information can be shared between multiple constraint solvers simultaneously working on the same instance, or information can help achieve good performance while solving a large set of related instances. In the first case, information sharing has to be performed at the expense of the underlying search effort, since a solver has to stop its main effort to prepare and communicate the information to other solvers; on the other hand, not sharing information can incur a cost for the whole system, with solvers potentially exploring infeasible spaces already discovered by other solvers.
In the second case, sharing performance measures can be done with little overhead, and the goal is to tune a constraint solver to the characteristics of a new instance; this corresponds to selecting the most suitable algorithm for solving a given instance. The book is suitable for researchers, practitioners, and graduate students working in the areas of optimization, search, constraints, and computational complexity.
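The second case, selecting a solver from performance measures gathered on earlier, related instances, can be sketched as follows (an assumed example, not taken from the book; the solver names, features, and threshold are made up):

```python
# Toy algorithm selection: bucket instances by clause/variable density and
# pick the solver with the best recorded runtime in that bucket.
def select_solver(instance_features, performance_log):
    """Return the solver with the smallest average runtime on similar instances."""
    density = instance_features['clauses'] / instance_features['variables']
    bucket = 'dense' if density > 4.0 else 'sparse'
    runtimes = performance_log[bucket]
    return min(runtimes, key=runtimes.get)

log = {'sparse': {'dpll': 0.4, 'local_search': 0.1},
       'dense':  {'dpll': 1.2, 'local_search': 5.3}}
print(select_solver({'variables': 100, 'clauses': 460}, log))  # dpll
```

Real algorithm-selection systems use far richer instance features and learned models, but the pattern is the same: performance measures from past runs steer the choice for a new instance.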

Seller: Dodax

As of: 17.02.2020. View offer

23,94 € *

plus shipping, if applicable

If you know how to program with Python and also know a little about probability, you're ready to tackle Bayesian statistics. With this book, you'll learn how to solve statistical problems with Python code instead of mathematical notation, and use discrete probability distributions instead of continuous mathematics. Once you get the math out of the way, the Bayesian fundamentals will become clearer, and you'll begin to apply these techniques to real-world problems. Bayesian statistical methods are becoming more common and more important, but not many resources are available to help beginners. Based on undergraduate classes taught by author Allen Downey, this book's computational approach helps you get a solid start.

* Use your existing programming skills to learn and understand Bayesian statistics
* Work with problems involving estimation, prediction, decision analysis, evidence, and hypothesis testing
* Get started with simple examples, using coins, M&Ms, Dungeons & Dragons dice, paintball, and hockey
* Learn computational methods for solving real-world problems, such as interpreting SAT scores, simulating kidney tumors, and modeling the human microbiome
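The computational approach described above can be sketched with a discrete Bayesian update (an assumed example in the book's spirit, not taken from it): estimate a coin's bias from flips using a grid of hypotheses instead of a continuous prior.

```python
# Discrete Bayes update: multiply each hypothesis's prior by the likelihood
# of the data under that hypothesis, then renormalize.
def bayes_update(prior, likelihood, data):
    posterior = {h: p * likelihood(h, data) for h, p in prior.items()}
    total = sum(posterior.values())
    return {h: p / total for h, p in posterior.items()}

hypotheses = [0.25, 0.5, 0.75]                        # possible heads probabilities
prior = {h: 1 / len(hypotheses) for h in hypotheses}  # uniform prior
heads_lik = lambda h, flips: h ** flips.count('H') * (1 - h) ** flips.count('T')

posterior = bayes_update(prior, heads_lik, 'HHHT')
print(max(posterior, key=posterior.get))  # 0.75
```

No calculus is needed: the whole update is a loop over a finite set of hypotheses, which is exactly the trade the book's blurb describes.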

Seller: Dodax

As of: 17.02.2020. View offer

107,00 CHF *

plus shipping, if applicable

Although NP-hard problems are believed to be unsolvable in general, tractability results suggest that some practical instances can be solved efficiently. Combinatorial search algorithms are designed to explore the usually large solution space of such instances efficiently by reducing the search space to feasible regions and using heuristics to explore those regions. Various mathematical formalisms may be used to express and tackle combinatorial problems, among them the constraint satisfaction problem (CSP) and the propositional satisfiability problem (SAT). These algorithms, or constraint solvers, apply search-space reduction through inference techniques, use activity-based heuristics to guide exploration, diversify the search through frequent restarts, and often learn from their mistakes. In this book the author focuses on knowledge sharing in combinatorial search: the capacity to generate and exploit meaningful information, such as redundant constraints, heuristic hints, and performance measures, during search, which can dramatically improve the performance of a constraint solver. Information can be shared between multiple constraint solvers simultaneously working on the same instance, or information can help achieve good performance while solving a large set of related instances. In the first case, information sharing has to be performed at the expense of the underlying search effort, since a solver has to stop its main effort to prepare and communicate the information to other solvers; on the other hand, not sharing information can incur a cost for the whole system, with solvers potentially exploring infeasible spaces already discovered by other solvers. In the second case, sharing performance measures can be done with little overhead, and the goal is to tune a constraint solver to the characteristics of a new instance; this corresponds to selecting the most suitable algorithm for solving a given instance.
The book is suitable for researchers, practitioners, and graduate students working in the areas of optimization, search, constraints, and computational complexity.

Seller: Orell Fuessli CH

As of: 17.02.2020. View offer

135,00 CHF *

plus shipping, if applicable

The NP-completeness of SAT is a celebrated example of the power of bounded-depth computation: the core of the argument is a depth reduction establishing that any small nondeterministic circuit - an arbitrary NP computation on an arbitrary input - can be simulated by a small nondeterministic circuit of depth 2 with unbounded fan-in - a SAT instance. Many other examples permeate theoretical computer science. On the Power of Small-Depth Computation discusses a selected subset of them and includes a few unpublished proofs. It starts with a unified treatment of the challenge of exhibiting explicit functions that have small correlation with low-degree polynomials. It goes on to describe an unpublished proof that small bounded-depth circuits (AC0) have exponentially small correlation with the parity function. The proof is due to Adam Klivans and Salil Vadhan; it builds upon and simplifies previous ones. Thereafter, the book presents a depth-reduction result by Leslie Valiant, the proof of which has not previously appeared in full. It concludes with the result by Benny Applebaum, Yuval Ishai, and Eyal Kushilevitz showing that, under standard complexity-theoretic assumptions, many cryptographic primitives can be implemented in very restricted computational models. On the Power of Small-Depth Computation is an ideal primer for anyone with an interest in computational complexity, random structures and algorithms, and theoretical computer science generally.
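"Correlation with the parity function" can be illustrated numerically (a hedged sketch; one common definition is assumed here, namely the probability of agreeing with parity minus the probability of disagreeing, over uniform inputs):

```python
# Exact correlation of a Boolean function with parity, by enumerating all
# n-bit inputs (feasible only for tiny n; purely for intuition).
from itertools import product

def correlation_with_parity(f, n):
    total = 0
    for x in product([0, 1], repeat=n):
        parity = sum(x) % 2
        total += 1 if f(x) == parity else -1
    return total / 2 ** n

AND = lambda x: int(all(x))
print(correlation_with_parity(AND, 3))  # 0.25
```

The circuit lower bounds discussed in the book say that for every small AC0 circuit this quantity is exponentially small in n, which is far beyond what such enumeration can verify.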

Seller: Orell Fuessli CH

As of: 17.02.2020. View offer

212,90 CHF *

plus shipping, if applicable

Hybrid Optimization focuses on the application of artificial intelligence and operations research techniques to constraint programming for solving combinatorial optimization problems. The book covers the most relevant topics investigated over the last ten years by leading experts in the field and speculates about future directions for research. It includes contributions by experts from different but related areas of research, including constraint programming, decision theory, operations research, SAT, and artificial intelligence, among others. These diverse perspectives are actively combined and contrasted in order to evaluate their relative advantages. The volume presents techniques for hybrid modeling; integrated solving strategies, including global constraints, decomposition techniques, and the use of relaxations; and search strategies, including tree search, local search, and metaheuristics. Various applications of the techniques presented, as well as supplementary computational tools, are also discussed.

Seller: Orell Fuessli CH

As of: 17.02.2020. View offer