Buckfest Abstracts


Mike Ackerman

"Elections with Partially Ordered Preferences"
Mike Ackerman, Bellarmine University

The preferences of voters who care (or don't care) about more than one issue can be naturally modeled using partially ordered sets. In this joint work with Sul-Young Choi, Peter Coughlin, Eric Gottlieb, and Japheth Wood, we replace each voter's preference poset with the set of its linear extensions, which suggests a way to tally ballots with any positional, pairwise, or other voting method that accepts totally ordered inputs. We also describe a second way to compute pairwise rankings from partially ordered preferences.
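
As a rough illustration of the first approach (a sketch only, not the authors' implementation), the Python fragment below expands a toy preference poset into its linear extensions and tallies them with a Borda-style positional count; the three-candidate example, the ballots, and the choice of Borda are assumptions made for the example.

    # Toy sketch: tally partially ordered ballots by expanding each voter's
    # preference poset into its linear extensions and applying Borda count.
    # The candidates, ballots, and choice of Borda are illustrative assumptions.
    from itertools import permutations
    from collections import defaultdict

    def linear_extensions(elements, relations):
        """All total orders of `elements` consistent with the strict pairs (a, b), meaning a > b."""
        exts = []
        for perm in permutations(elements):
            pos = {x: i for i, x in enumerate(perm)}
            if all(pos[a] < pos[b] for a, b in relations):   # a must be ranked above b
                exts.append(perm)
        return exts

    def borda_from_posets(candidates, ballots):
        """Each ballot is a set of pairs (a, b) meaning the voter prefers a to b."""
        scores = defaultdict(float)
        n = len(candidates)
        for relations in ballots:
            exts = linear_extensions(candidates, relations)
            for ext in exts:
                for rank, c in enumerate(ext):
                    # Average the Borda points over the ballot's linear extensions.
                    scores[c] += (n - 1 - rank) / len(exts)
        return dict(scores)

    # Three voters over candidates a, b, c; the second voter is indifferent between b and c.
    ballots = [{("a", "b"), ("b", "c")},   # a > b > c
               {("a", "b"), ("a", "c")},   # a above b and c; b, c incomparable
               {("c", "a"), ("a", "b")}]   # c > a > b
    print(borda_from_posets(["a", "b", "c"], ballots))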


William H. E. Day

"Hazardous Pay for Phylogenetic Systematists: The Explosive Nature of Supertree Methods"
William H. E. Day

In phylogenetic systematics a problem of great practical and theoretical interest is to construct one or more large phylogenies, i.e., supertrees, from a given set of small phylogenies with overlapping sets of leaf labels. Closed sets identify sets of phylogenetic features that are implied by, but may not be in, a supertree method's input. On the other hand, exploding sets identify sets of features that are so contradictory as to convey no meaning. I will explore how set closures and explosions can affect the inference of rooted phylogenies by supertree methods.


George F Estabrook

"A New Application for the Algorithm of Estabrook and McMorris 1977"
George F Estabrook, Ecology and Evolution, The University of Michigan, Ann Arbor

In 1976 I spent a month in the laboratory of Don Boulter in Durham, England, where they were developing then-new technology to determine the sequence of amino acids that makes up a protein. Homologous proteins from a diversity of flowering plant species were then compared to estimate the historical ancestor-descendant relation among these species and their common ancestors. While in Durham, I devised an algorithm to test the logical compatibility of pairs of hypotheses about this relation based on amino acid sequences. When I described that algorithm to Buck later that year, he proved that in general it would give a correct result. This result was published in 1977 (J. Math. Biol. 4:195). The algorithm has been hard at work for 40 years. This talk describes the algorithm and presents a new application for it.
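
The published algorithm handles general qualitative characters; as a hedged illustration of the flavour of a pairwise compatibility test, the sketch below implements only the classical special case for binary characters (the "four-gamete" test), with made-up data. It is not the Estabrook-McMorris procedure itself.

    # Illustrative sketch only: the classical pairwise-compatibility test for
    # *binary* characters (the "four-gamete" test), not the 1977 algorithm,
    # which treats more general qualitative characters.

    def binary_characters_compatible(char1, char2):
        """char1, char2: dicts mapping each taxon to a state in {0, 1}.
        Two binary characters fit on a common tree without homoplasy
        iff not all four state combinations occur among the taxa."""
        combos = {(char1[t], char2[t]) for t in char1}
        return len(combos) < 4

    # Five taxa scored for two binary characters (made-up data).
    c1 = {"A": 0, "B": 0, "C": 1, "D": 1, "E": 1}
    c2 = {"A": 0, "B": 1, "C": 1, "D": 1, "E": 1}
    print(binary_characters_compatible(c1, c2))   # True: the combination (1, 0) never occurs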


Ophir Frieder

"McMorris – A Fond Farewell"
Ophir Frieder, Department of Computer Science, Illinois Institute of Technology

Initially, common ground is reviewed, followed by a description of ongoing interactions. The talk concludes with a summary of initial accomplishments and some potential future directions for an as-yet unfulfilled goal.

(I trace my interactions with Buck and lead into a discussion on an area within medical informatics.)


Pierre Hansen

"A survey on exact methods for minimum sum-of-squares clustering"
Pierre Hansen, GERAD and HEC Montréal
Daniel Aloise, Département de mathématiques et génie industriel, École Polytechnique de Montréal

Abstract: [PDF]


Ewa Kubicka

"Optimal Stopping Time on a Minority Color in a 2-Color Urn Scheme"
Ewa Kubicka and Grzegorz Kubicki, University of Louisville

Consider a complete bipartite graph G of odd order 2n+1 in which the size of one partite set is given by a binomial distribution with p = 1/2. The vertices of G are observed one by one in a random order, and the subgraph induced by the vertices observed so far is revealed. We find the optimal stopping time maximizing the probability of stopping on a vertex from the smaller partite set. The probability of success is given and its asymptotic behavior is established.
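
To make the setup concrete, here is a small Monte Carlo sketch of the model. The simple "wait, then stop on the currently minority side" rule and all parameter values are assumptions chosen for illustration; they are not the optimal stopping time derived in the talk.

    # Monte Carlo sketch of the setup: 2n+1 vertices, each independently on side
    # 0 or 1 with probability 1/2, observed in random order; the edges of the
    # complete bipartite graph reveal which observed vertices lie on opposite sides.
    import random

    def simulate(n, wait_fraction=0.5, trials=20_000):
        """Estimate the success probability of an ad-hoc stopping rule."""
        N = 2 * n + 1
        successes = 0
        for _ in range(trials):
            sides = [random.randint(0, 1) for _ in range(N)]   # |side 0| ~ Binomial(N, 1/2)
            smaller = 0 if sides.count(0) < sides.count(1) else 1
            seen = [0, 0]                                      # observed vertices per side
            for t, v in enumerate(random.sample(range(N), N)): # random observation order
                seen[sides[v]] += 1
                # Hypothetical rule: after a waiting period, stop at the first vertex
                # whose side is in the strict minority among the vertices observed so far.
                if t + 1 >= wait_fraction * N and seen[sides[v]] < seen[1 - sides[v]]:
                    successes += (sides[v] == smaller)
                    break
        return successes / trials

    print(simulate(n=10))   # rough success probability of this ad-hoc rule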


Grzegorz Kubicki

"Secretary Problem for Twins"
Grzegorz Kubicki, University of Louisville, and Michal Morayne, Wroclaw University of Technology

Consider a poset P consisting of two chains of the same length; the only incomparable elements are elements from the same level, referred to as twins. The elements of P are observed one by one in some random permutation. At any time we observe the partial order induced by the elements that have appeared so far. We want to stop at the currently observed element so as to maximize the probability that it is one of the two maximal twins. We find the optimal stopping time and establish the asymptotic behavior of the probability of success.
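
Again as illustration only, the sketch below simulates the twins model and applies a classical secretary-style threshold rule. The threshold and the rule itself are assumptions; the optimal stopping time from the talk will generally differ.

    # Monte Carlo sketch of the twins setup: levels 1..n, two incomparable twins
    # per level, x < y exactly when level(x) < level(y).  Elements arrive in
    # random order and we try to stop on one of the two top-level twins.
    import random

    def simulate_twins(n, learn_fraction=0.37, trials=20_000):
        """Estimate the success probability of a secretary-style threshold rule."""
        levels = [lvl for lvl in range(1, n + 1) for _ in range(2)]   # two twins per level
        threshold = int(learn_fraction * 2 * n)
        successes = 0
        for _ in range(trials):
            best_seen = 0
            for t, lvl in enumerate(random.sample(levels, 2 * n)):    # random arrival order
                is_maximal = lvl >= best_seen      # no observed element lies above it
                best_seen = max(best_seen, lvl)
                # Hypothetical rule: after a learning phase, stop at the first element
                # that is maximal among everything observed so far.
                if t >= threshold and is_maximal:
                    successes += (lvl == n)        # success iff it is one of the top twins
                    break
        return successes / trials

    print(simulate_twins(n=20))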


François-Joseph Lapointe

"A Theoretical Framework for Phylogenetic Dance Composition"
François-Joseph Lapointe, Département de sciences biologiques, Université de Montréal, Canada, and Département de danse, Université du Québec à Montréal, Canada

The genetic code is composed of different nucleotides that make up our DNA, and the various DNA sequences uniquely characterize the genes encoded in our chromosomes. These genes are subjected to different evolutionary pressures, such that the DNA sequences of our most recent ancestors are quite different from ours. The science of phylogenetics aims at recovering the evolution of a gene over time by estimating the ancestor-descendant relationships among different species, or different populations. The complex processes of mutation and selection are the main forces driving the evolution of genes, and the complete history of a particular gene can only be studied using appropriate models of evolutionary change.

A choreographer also applies mutation and selection to generate dance pieces. Whereas the nucleotides are the words of the genetic vocabulary, the movements represent the vocabulary of dance. Consequently, a phylogenetic approach can be applied to transform movement sequences over time, to mutate the movement vocabulary, and to evolve choreographies. This process can be easily implemented using a phylogenetic algorithm, which takes as input an original movement sequence to produce a number of final sequences that are obtained through various types of choreographic operations. The history of movement sequences over time, the phylogenetic choreography, retraces the cumulative transformation of ancestral sequences into descendant sequences, based on a genetic model of evolutionary change.

In this paper, I will compare different models of DNA evolution, in the specific context of phylogenetic dance composition. I will propose an array of objective and subjective criteria to compare the choreographies generated with the various approaches. On the one hand, complexity, information and diversity measures will be used to compare the different movement sequences quantitatively. On the other hand, qualitative measures will be applied to assess the aesthetics of phylogenetic dance. Short choreographies generated with this novel approach will also be presented to illustrate the use of genetic models for dance composition.
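
As a toy sketch of the kind of process described above (purely illustrative, not the author's choreographic algorithm), the fragment below "evolves" a short movement sequence by random substitutions under an assumed movement vocabulary and mutation rate.

    # Toy sketch: "evolve" a movement sequence by random substitutions, loosely
    # analogous to a simple DNA substitution model.  The vocabulary, mutation
    # rate, and tree of descendants are illustrative assumptions.
    import random

    VOCAB = ["turn", "jump", "reach", "fall", "spin", "pause"]   # hypothetical vocabulary

    def mutate(sequence, rate=0.15):
        """Replace each movement independently with probability `rate`."""
        return [random.choice(VOCAB) if random.random() < rate else move
                for move in sequence]

    def descend(ancestor, generations=3, children_per_node=2):
        """Grow a small tree of descendant sequences from one ancestral sequence."""
        current = [ancestor]
        for _ in range(generations):
            current = [mutate(seq) for seq in current for _ in range(children_per_node)]
        return current

    ancestor = ["reach", "turn", "pause", "jump", "fall"]
    for leaf in descend(ancestor):
        print(" ".join(leaf))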


Terry McKee

"Bucking conventional wisdom: How chordal graphs exemplify graph theory"
Terry McKee, Wright State University

Transcending the sentiment that chordal graphs are very useful, but very, very special, I explore how general graph theory can be viewed as a reflection of chordal graph theory. This premise exploits the parallel between the minimal clique separator decomposition of arbitrary graphs into maximal prime subgraphs (mp-subgraphs) and the decomposition of chordal graphs into maximal complete subgraphs (maxcliques). In particular, selected results on maxcliques of chordal graphs generalize into results on mp-subgraphs of arbitrary graphs.
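
As a small concrete illustration of the chordal side of this parallel (an assumption-laden toy, not the talk's machinery), the snippet below builds a chordal graph with networkx, confirms chordality, and lists its maxcliques.

    # Small illustration: the maximal complete subgraphs (maxcliques) of a
    # chordal graph, the pieces referred to above.  The example graph is made up.
    import networkx as nx

    G = nx.Graph([("a", "b"), ("a", "c"), ("b", "c"),      # triangle a-b-c
                  ("b", "d"), ("c", "d"),                   # triangle b-c-d
                  ("d", "e")])                              # pendant edge

    print(nx.is_chordal(G))                          # True: every cycle of length > 3 has a chord
    print([sorted(c) for c in nx.find_cliques(G)])   # maxcliques {a,b,c}, {b,c,d}, {d,e}, in some order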


Boris Mirkin

"Possibilistic approach to partitioning"
Boris Mirkin, School of Computer Science, Birkbeck University of London, UK

The axiomatic consensus approach to social choice has maintained both perspectives, impossibility and possibility, in parallel: impossibility for rational choices and possibility for majority rules. A similar approach has been advocated in clustering by B. McMorris and his collaborators in the analysis of phylogenies and hierarchies: both impossibility of generally rational summarisations and possibilities for generalised majority rules. The author has pursued a similar line regarding partitions. The real issue that bothers me is whether such an approach can yield computationally feasible and reliable methods for processing real data. A positive answer may come from cross-breeding two lines: (a) a general multi-parameter methodology for data processing and (b) the axiomatic approach. An example of such a combination is given: the general Lance-Williams family of agglomeration algorithms and a general axiom lead to computationally interesting clustering criteria. Some properties of these criteria are analysed.
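
For readers unfamiliar with the Lance-Williams family mentioned above, the following minimal sketch shows the defining update recurrence for one agglomeration step, instantiated with single-linkage parameters; the distance data and the parameter choice are illustrative assumptions.

    # Minimal sketch of the Lance-Williams recurrence that defines the family of
    # agglomerative algorithms: after merging clusters i and j,
    #   d(i u j, k) = a_i*d(i,k) + a_j*d(j,k) + b*d(i,j) + g*|d(i,k) - d(j,k)|.
    # The single-linkage parameters below are just one member of the family.

    def lance_williams_merge(dist, i, j, a_i=0.5, a_j=0.5, b=0.0, g=-0.5):
        """dist: dict frozenset({x, y}) -> distance.  Returns the distances after
        merging clusters i and j into the new cluster (i, j)."""
        merged = (i, j)
        clusters = {x for pair in dist for x in pair} - {i, j}
        new_dist = {pair: d for pair, d in dist.items() if i not in pair and j not in pair}
        for k in clusters:
            dik = dist[frozenset({i, k})]
            djk = dist[frozenset({j, k})]
            dij = dist[frozenset({i, j})]
            new_dist[frozenset({merged, k})] = a_i * dik + a_j * djk + b * dij + g * abs(dik - djk)
        return new_dist

    # Three singleton clusters with pairwise distances (made-up numbers).
    d0 = {frozenset({"x", "y"}): 1.0, frozenset({"x", "z"}): 4.0, frozenset({"y", "z"}): 3.0}
    print(lance_williams_merge(d0, "x", "y"))   # single linkage: d({x,y}, z) = 3.0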


H. Martyn Mulder

"The median procedure on graphs"
H. Martyn Mulder, Econometrisch Instituut, Erasmus Universiteit, Rotterdam

A consensus function is a model describing a rational way to obtain consensus among a group of agents or clients. The input of the function consists of certain information about the agents, and the output concerns the issue about which consensus should be reached. The rationality of the process is guaranteed by the fact that the consensus function satisfies certain "rational" rules or "consensus axioms". A typical question in consensus theory is: which set of axioms characterizes a given consensus function? Our main focus is the median function on graphs: the input is the locations of the clients in the graph, and the output is the set of medians, i.e. the vertices that minimize the average distance to the locations of the clients. We survey old and new results. Median graphs play a major role.
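
A minimal sketch of the median function just described: given a profile of client locations, return the vertices of minimum total distance to the profile. The example graph and profile are assumptions for illustration.

    # Median function on a graph: the vertices minimizing the total (equivalently,
    # average) distance to a profile of client locations.  networkx supplies distances.
    import networkx as nx

    def median_set(G, profile):
        dist = dict(nx.all_pairs_shortest_path_length(G))
        cost = {v: sum(dist[v][p] for p in profile) for v in G}
        best = min(cost.values())
        return {v for v, c in cost.items() if c == best}

    G = nx.path_graph(5)                 # path 0 - 1 - 2 - 3 - 4
    print(median_set(G, [0, 1, 4]))      # {1}: total distance 1 + 0 + 3 = 4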


Robert C. Powers

"Buckfest 2008: Aggregation Rules that satisfy Anonymity and Neutrality"
Jonathan Perry and Robert C. Powers, Department of Mathematics, University of Louisville

In the case of two alternatives, elementary counting techniques are used to determine the number of aggregation rules that satisfy anonymity and neutrality. We also introduce a restrictive version of anonymity, which we call partial anonymity, and use our main results to calculate the number of aggregation functions that satisfy partial anonymity and neutrality.
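
The counting itself can be checked by brute force for a tiny electorate. The sketch below assumes one particular convention (strict ballots over two alternatives, outputs "a wins", "b wins", or "tie"), which may differ from the paper's setting, and simply enumerates all rules.

    # Brute-force count of anonymous, neutral rules under assumed conventions:
    # each of n voters strictly prefers a (+1) or b (-1); a rule maps every
    # profile to +1 (a wins), -1 (b wins), or 0 (tie).  Anonymity: the outcome is
    # invariant under permuting voters; neutrality: swapping a and b in every
    # ballot swaps the outcome.
    from itertools import product

    def count_rules(n):
        profiles = list(product((1, -1), repeat=n))
        index = {p: i for i, p in enumerate(profiles)}
        count = 0
        for outputs in product((1, 0, -1), repeat=len(profiles)):
            anonymous = all(outputs[index[p]] == outputs[index[tuple(sorted(p))]]
                            for p in profiles)
            neutral = all(outputs[index[tuple(-x for x in p)]] == -outputs[index[p]]
                          for p in profiles)
            if anonymous and neutral:
                count += 1
        return count

    print(count_rules(3))   # number of anonymous, neutral rules for 3 voters under these conventions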


Ed Reingold

"Lower Bounds for Cops and Robber Pursuit"
Ed Reingold, Department of Computer Science, Illinois Institute of Technology

We prove that the robber can evade (that is, stay at least unit distance from) at least ⌊n / 5.889⌋ cops in an n × n continuous square region.


Fred Roberts

"Algorithms for Port of Entry Inspection"
Fred Roberts, Rutgers University

As a stream of containers arrives at a port, a decision maker has to decide how to inspect them, which to subject to further inspection, which to allow to pass through with only minimal levels of inspection, etc. We look at this as a complex sequential decision-making problem. Sequential decision problems arise in many areas, including communication networks, manufacturing, artificial intelligence and computer science, and medicine. The problem we investigate is to find algorithms for sequential diagnosis that minimize the total "cost" of the inspection procedure, including the cost of false positives and false negatives. To make the problem precise, we imagine a stream of containers arriving at the port with the goal of classifying each of them into one of several categories. There are several possible tests that can be performed, and an inspection scheme specifies which test to perform next based on the outcomes of previous tests. Stroud and Saeger at Los Alamos have formulated this problem, in an important special case, as a problem of finding an optimal binary decision tree for an appropriate binary decision function. We describe the basic idea of the Stroud-Saeger method and the results of new algorithms that significantly increase the size of the decision problems for which it is applicable.
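
As a toy illustration of the cost model (not the Stroud-Saeger formulation or the new algorithms), the sketch below evaluates the expected cost of one candidate decision tree over two hypothetical tests; the test costs, outcome probabilities, penalties, and decision function are all made-up assumptions.

    # Toy sketch: expected cost of one candidate binary decision tree for a binary
    # decision function F over the outcomes of two tests.  All numbers are made up,
    # and the two test outcomes are assumed independent for simplicity.
    from itertools import product

    TEST_COST = {"gamma": 2.0, "xray": 5.0}          # hypothetical per-test costs
    FP_COST, FN_COST = 10.0, 1000.0                  # misclassification penalties
    P_POSITIVE = {"gamma": 0.10, "xray": 0.05}       # P(test reads positive)

    def F(outcome):                                  # decision function: flag if both tests positive
        return outcome["gamma"] and outcome["xray"]

    # A tree is either ("leaf", decision) or ("test", name, subtree_if_0, subtree_if_1).
    TREE = ("test", "gamma",
            ("leaf", 0),                             # gamma negative: release immediately
            ("test", "xray", ("leaf", 0), ("leaf", 1)))

    def run(tree, outcome):
        """Return (decision, total test cost) for one container."""
        if tree[0] == "leaf":
            return tree[1], 0.0
        _, name, low, high = tree
        decision, cost = run(high if outcome[name] else low, outcome)
        return decision, cost + TEST_COST[name]

    def expected_cost(tree):
        total = 0.0
        for bits in product((0, 1), repeat=2):
            outcome = dict(zip(("gamma", "xray"), bits))
            prob = 1.0
            for name, bit in outcome.items():
                prob *= P_POSITIVE[name] if bit else 1 - P_POSITIVE[name]
            decision, cost = run(tree, outcome)
            truth = F(outcome)
            cost += FP_COST if decision and not truth else 0.0
            cost += FN_COST if truth and not decision else 0.0
            total += prob * cost
        return total

    print(expected_cost(TREE))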

