INVITED SPEAKER ABSTRACTS
Some optimisation problems in electrical power systems
Electrical power grids are a rich source of problems in optimisation and data analysis. This talk describes work on two such problems. In the first, we formulate a bi-level optimisation problem to identify possible vulnerabilities by finding the attack that causes maximal disruption. In the second, we describe multivariate logistic regression (MLR) and deep learning approaches for identifying outages in a grid from real-time sensor network data. We show that when these classifiers are trained to recognise the “signature” of outages under a variety of network conditions, they can identify outages correctly in the vast majority of cases. An extension of our approach allows identification of optimal sensor locations.
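As an illustration (not part of the talk), the outage-classification idea can be sketched with a minimal, library-free logistic regression. The sensor features and outage labels below are entirely synthetic stand-ins for the real-time sensor data described above:

```python
import math
import random

def train_logistic(X, y, lr=0.5, epochs=300):
    """Plain per-sample gradient-descent logistic regression (no libraries).

    X: list of feature vectors (e.g. per-sensor deviations from normal),
    y: 1 if an outage was present, 0 otherwise.
    """
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            z = max(-30.0, min(30.0, z))          # guard against overflow
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    return 1 if b + sum(wj * xj for wj, xj in zip(w, x)) > 0 else 0

# Synthetic "signatures": outages shift both sensor readings upward.
random.seed(0)
normal = [[random.gauss(0.0, 0.3), random.gauss(0.0, 0.3)] for _ in range(50)]
outage = [[random.gauss(2.0, 0.3), random.gauss(2.0, 0.3)] for _ in range(50)]
X, y = normal + outage, [0] * 50 + [1] * 50
w, b = train_logistic(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

On well-separated synthetic clusters such as these, the classifier recovers essentially all labels; the real problem involves many sensors and a wide variety of network conditions.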
Optimisation in deregulated electricity markets: Australian experience
Six kinds of optimisation process are conducted in deregulated electricity markets in Australia, all with generally weak relationships across the supply chain: generation planning, transmission planning, distribution planning, demand side (customer) planning, fuel supply planning and emission abatement planning. This is further complicated by certificate schemes for large- and small-scale renewable energy technologies. The talk will describe the optimisation methods applied within each process and the current mechanisms for integrating these planning processes, which could potentially optimise the whole electricity supply chain; share anecdotes of how and where this integration has broken down in the past; and speculate on how integrated planning across multiple entities could be made more effective, having regard to future uncertainties.
Water supply optimisation
Recent droughts and a shifting climate have highlighted the vulnerability of urban and rural water supply systems in Australia and around the world, and there have been some major investments in water infrastructure in recent times. A major city running out of water in the modern world is unthinkable. Managing water security is a complex task: it involves developing a long-term strategy to meet future demands and deliver agreed levels of service, in terms of supply reliability and security, through new supply sources and demand management. The conventional approach uses trial and error to minimise capital and operational costs against fixed objectives.
Multi-criterion optimisation offers a step-change improvement over the conventional approach by producing a Pareto set of solutions that expresses the trade-offs between competing objectives. Recent advances in cloud-based parallel computing have enabled planners to explore a multitude of solutions and trade-offs and to present a variety of supply portfolios to decision-makers. The presentation will cover advanced optimisation techniques developed and applied by WaterNSW.
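As a small illustration (not from the WaterNSW work), extracting the Pareto set reduces to filtering out dominated solutions; the portfolio figures below are hypothetical:

```python
def pareto_front(solutions):
    """Return the non-dominated subset for a minimisation problem.

    Each solution is a tuple of objective values (e.g. capital cost,
    supply-shortfall risk); lower is better in every objective.
    A solution is dominated if another is no worse in all objectives
    and differs from it.
    """
    front = []
    for s in solutions:
        dominated = any(
            all(o <= so for o, so in zip(other, s)) and other != s
            for other in solutions
        )
        if not dominated:
            front.append(s)
    return front

# Hypothetical supply portfolios: (capital cost $M, shortfall risk %)
portfolios = [(100, 9), (120, 5), (150, 2), (130, 6), (160, 8)]
front = pareto_front(portfolios)
# (130, 6) and (160, 8) are dominated; the other three express the
# genuine cost-versus-risk trade-off offered to decision-makers.
```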
Automatic logic-based Benders decomposition with MiniZinc
Logic-based Benders decomposition (LBBD) is a powerful hybrid optimisation technique that can combine the strong dual bounds of mixed integer programming (MIP) with the combinatorial search strengths of constraint programming (CP). A major drawback of LBBD is that implementing an LBBD solution to a problem is a far more involved process than the “model-and-run” approach provided by both CP and MIP. We propose an automated approach that accepts an arbitrary MiniZinc model and solves it using LBBD with no additional intervention on the part of the modeller. The design of this approach also reveals an interesting duality between LBBD and large neighbourhood search (LNS). We compare our implementation of this approach to CP and MIP solvers on four different problem classes where LBBD has been applied before.
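The assign–check–cut cycle at the heart of LBBD can be sketched on a toy assignment problem (entirely illustrative data; brute-force enumeration stands in for the master MIP and a simple capacity check stands in for the CP subproblem):

```python
from itertools import product

jobs = {"a": 3, "b": 4, "c": 2}       # job durations
machines = {"m1": 5, "m2": 6}         # machine capacities
cost = {("a", "m1"): 1, ("a", "m2"): 2, ("b", "m1"): 1,
        ("b", "m2"): 3, ("c", "m1"): 2, ("c", "m2"): 1}

def solve_master(cuts):
    """Enumerate assignments (stand-in for a MIP), honouring all cuts."""
    best = None
    for combo in product(machines, repeat=len(jobs)):
        assign = dict(zip(jobs, combo))
        if any(all(assign[j] == m for j, m in cut) for cut in cuts):
            continue                   # forbidden by a Benders cut
        c = sum(cost[j, assign[j]] for j in jobs)
        if best is None or c < best[0]:
            best = (c, assign)
    return best

def subproblem(assign):
    """Per-machine feasibility check; return a no-good cut on failure."""
    for m, cap in machines.items():
        mine = [j for j in jobs if assign[j] == m]
        if sum(jobs[j] for j in mine) > cap:
            return [(j, m) for j in mine]   # not all of these jobs on m
    return None

cuts = []
while True:
    obj, assign = solve_master(cuts)
    cut = subproblem(assign)
    if cut is None:
        break                          # master solution is feasible: optimal
    cuts.append(cut)
```

Here the first master solution overloads m1, a cut forbids that combination, and the second iteration yields the optimum. The automated approach in the talk derives the decomposition and the cuts from the MiniZinc model itself.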
Optimisation applications at the Australian Bureau of Statistics
The Australian Bureau of Statistics uses optimisation methods in a range of applications. As well as “textbook” OR problems such as job allocation, we apply optimisation to some less conventional problems. I will concentrate on two examples:
Secondary cell suppression occurs when we need to censor publication of some cells within a statistical table for privacy reasons. As well as suppressing the confidential cells (“primary suppression”) we often need to suppress additional cells in the table, so that readers cannot deduce the confidential data by “differencing” non-confidential cells.
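A minimal worked example of the differencing attack (with made-up figures):

```python
# A table row with a published total: censoring only the confidential
# cell "small_business" is not enough, because its value can be
# recovered by differencing the total against the published cells.
published = {"large_business": 740, "government": 120, "total": 900}
recovered = (published["total"]
             - published["large_business"]
             - published["government"])
# recovered == 40: the "protected" cell is fully recoverable, so at
# least one secondary cell must also be suppressed.
```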
However, suppressing cells reduces the value of the published data. Hence, we need a method that minimises the “cost” of suppression (e.g. by assigning each cell a value) while guaranteeing adequate protection for confidential cells. This can be formulated as a mixed integer programming problem, but the computational requirements quickly become high, especially for tables in more than two dimensions.
Table balancing occurs in economics and demography, when we collect data from multiple sources that give different perspectives on the same phenomena: for instance, the sale of a car can be measured from both buyer’s and seller’s side. In such cases optimisation methods can help reconcile conflicting sources to produce a self-consistent and accurate estimate of what’s really going on.
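One simple reconciliation recipe (illustrative only; the car-sale figures and variances below are made up) is a variance-weighted least-squares estimate:

```python
def reconcile(measurements):
    """Variance-weighted least-squares reconciliation of one quantity.

    measurements: list of (value, variance) pairs from different sources
    (e.g. buyer-side and seller-side reports of the same car sale).
    Minimising sum((x - v_i)^2 / var_i) over x gives the
    precision-weighted mean: more reliable sources get more weight.
    """
    weights = [1.0 / var for _, var in measurements]
    return sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)

# Buyer reports $30,000 (noisy survey, variance 4.0e6);
# seller reports $31,500 (admin data, variance 1.0e6).
estimate = reconcile([(30000.0, 4.0e6), (31500.0, 1.0e6)])
# The reconciled estimate lands at $31,200, closer to the more
# precise seller-side source.
```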
Optimisation and games in transportation
In this talk I will present a survey of some applications of optimisation and game theory in equilibrium models for transportation systems. Starting from my early work on equilibrium flows for congested transit systems and some of the applications that these models have had over the years, I will move to more recent work on stochastic traffic equilibrium, risk-averse route choice, dynamic equilibrium, and the limiting behaviour of the Price of Anarchy for highly congested networks.
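The Price of Anarchy can be illustrated (outside the talk's scope) with the classical Pigou two-link network, where it works out to 4/3 for linear latencies:

```python
# Pigou's network: one unit of flow from s to t over two parallel links.
# Link 1 has latency l1(x) = x; link 2 has constant latency l2(x) = 1.
def total_cost(x1):
    """Total travel cost when a fraction x1 of the flow uses link 1."""
    x2 = 1.0 - x1
    return x1 * x1 + x2 * 1.0          # x1*l1(x1) + x2*l2(x2)

# At equilibrium every user takes link 1 (it is never slower than 1).
equilibrium_cost = total_cost(1.0)
# The social optimum splits the flow; found here by a fine grid search.
optimal_cost = min(total_cost(i / 1000) for i in range(1001))
price_of_anarchy = equilibrium_cost / optimal_cost   # = 1 / 0.75 = 4/3
```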
Operations Research: for and with industry
In this talk, the speaker traces his applied operations research work for, and with industry. Through specific examples of work in industry sectors such as airports, tourism & travel, wine, coal, the police, and the postal services, this talk traces real applications of operations research. While these applied OR projects have resulted in benefits and outcomes to the companies involved, they have also resulted in a stream of publications. The speaker concludes that business impact and science impact are not necessarily orthogonal. What is needed is a research practitioner’s mindset.
Seeking multiple solutions: multi-modal optimisation using niching methods
Population-based or single-solution search-based optimisation algorithms (i.e. meta-heuristics) in their original forms are usually designed for locating a single global solution. Representative examples include, among others, evolutionary and swarm intelligence algorithms. These search algorithms typically converge to a single solution because of the global selection scheme used. Nevertheless, many real-world problems are “multi-modal” by nature, i.e., multiple satisfactory solutions exist. It may be desirable to locate many such satisfactory solutions, or even all of them, so that a decision maker can choose the one most appropriate to their problem domain. Numerous techniques have been developed in the past for locating multiple optima (global and/or local). These techniques are commonly referred to as “niching” methods, e.g., crowding, fitness sharing, de-rating, restricted tournament selection, clearing, speciation, etc. In more recent times, niching methods have also been developed for meta-heuristic algorithms such as Particle Swarm Optimisation, Differential Evolution and Evolution Strategies.
In this talk I will introduce niching methods, including their historical background and the motivation for employing niching in evolutionary algorithms (EAs). I will describe a few classic niching methods, such as fitness sharing and crowding, then review several new niching methods that have been developed for meta-heuristics such as Particle Swarm Optimisation and Differential Evolution. Employing niching methods in real-world situations still faces significant challenges, and this talk will discuss several such difficulties; in particular, niching in static and dynamic environments will be specifically addressed. Several examples of applying niching methods to real-world optimisation problems will also be provided.
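A minimal sketch of one classic niching method, deterministic crowding, on a one-dimensional multi-modal function (the function, operators and parameters are illustrative choices, not from the talk):

```python
import math
import random

def f(x):
    """Multi-modal fitness with five equal peaks at x = 0.1, 0.3, ..., 0.9."""
    return math.sin(5 * math.pi * x) ** 2

def deterministic_crowding(pop_size=40, generations=200, seed=1):
    """Minimal deterministic crowding: each offspring competes only with
    its nearest parent, so sub-populations on different peaks (niches)
    can survive side by side instead of collapsing to one optimum."""
    rng = random.Random(seed)
    pop = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        rng.shuffle(pop)
        for i in range(0, pop_size - 1, 2):
            p1, p2 = pop[i], pop[i + 1]
            # blend crossover plus small Gaussian mutation, clipped to [0, 1]
            c1 = min(1.0, max(0.0, (p1 + p2) / 2 + rng.gauss(0, 0.02)))
            c2 = min(1.0, max(0.0, p1 + rng.gauss(0, 0.02)))
            for c in (c1, c2):
                # offspring replaces its nearest parent only if no worse
                j = i if abs(c - p1) <= abs(c - p2) else i + 1
                if f(c) >= f(pop[j]):
                    pop[j] = c
    return pop

final = deterministic_crowding()
best_fitness = max(f(x) for x in final)
```

The localised replacement rule is the whole trick: a global selection scheme would let the first peak found take over the population, whereas here individuals near different peaks rarely compete with one another.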
Evaluating the impact of maintenance on the throughput capacity of Australian coal chains
Coal is Australia’s second largest export and was worth $34.3 billion in 2015-2016. Two of Australia’s largest coal chains are the Hunter Valley Coal Chain and the Central Queensland Coal Network. The Hunter Valley Coal Chain is Australia’s largest export coal chain by volume, exporting 161 Mt in 2016 with a trade value of $15.2 billion. The Central Queensland Coal Network has the largest export coal rail network in Australia consisting of 2,700 km of track comprising four major coal systems and transported 226 Mt in 2015-2016.
Maintenance and renewal work plays a crucial role in the management of a coal chain. The planning of such work is complex and requires the balancing of three competing objectives: maximising the available throughput capacity, ensuring the reliability of the infrastructure assets, and minimising the associated costs. As export volumes grow, such planning becomes progressively more important and more challenging.
In this talk we describe the methodology underlying decision support tools that have been developed at The University of Newcastle to quickly evaluate the impact of planned maintenance and renewal work on the system throughput capacity of the Hunter Valley Coal Chain and the Central Queensland Coal Network. These tools highlight the value that relatively simple mathematical modelling and optimisation techniques can provide in the solution of practical problems arising in industry.
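In the spirit of the "relatively simple" modelling mentioned above, the effect of a maintenance closure on throughput capacity can be illustrated as a max-flow computation on a toy network (the topology and capacities below are hypothetical, not the real coal chains):

```python
from collections import deque
from copy import deepcopy

def max_flow(cap, source, sink):
    """Edmonds-Karp max flow; cap is {u: {v: capacity}} (Mt per week)."""
    res = {u: dict(vs) for u, vs in cap.items()}   # residual capacities
    for u, vs in cap.items():
        for v in vs:
            res.setdefault(v, {}).setdefault(u, 0)  # reverse edges
    total = 0
    while True:
        parent = {source: None}                     # BFS for a shortest path
        q = deque([source])
        while q and sink not in parent:
            u = q.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if sink not in parent:
            return total
        path, v = [], sink                          # recover the path edges
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(res[u][v] for u, v in path)
        for u, v in path:                           # augment along the path
            res[u][v] -= bottleneck
            res[v][u] += bottleneck
        total += bottleneck

# Hypothetical chain: two mines feed a rail junction and a port.
chain = {
    "src": {"mine1": 10, "mine2": 10},   # super-source feeding both mines
    "mine1": {"junction": 3},
    "mine2": {"junction": 2, "port": 1},
    "junction": {"port": 4},
    "port": {},
}
baseline = max_flow(chain, "src", "port")

# Track maintenance halves the junction-to-port capacity.
closed = deepcopy(chain)
closed["junction"]["port"] = 2
with_maintenance = max_flow(closed, "src", "port")
```

Here the closure drops system capacity from 5 to 3 Mt per week, quantifying the throughput cost of scheduling that maintenance window.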
Optimisation for the livestock industry in northern Australia
The cattle industry in the northern states of Australia is currently facing important challenges, which include increasing demand, a strong reliance on live exports, and a changing climate. In total, cattle and beef production currently accounts for 17% of Australia’s $32.5 billion agricultural industry. CSIRO developed a suite of tools to analyse small and large scale investments in the supply chain for all northern livestock logistics. As part of this effort, we developed a strategic optimisation model to help the stakeholders determine the optimal location of cattle rest sites (known as spelling yards) and the optimal flows from breeding farms to ports, abattoirs and saleyards, subject to budget, site capacities and service requirements. The model also considers the operational guidelines that regulate maximum driving hours and water deprivation times. The model not only recommends spelling yard sites and shows that an additional abattoir can increase the value of the supply chain by over $715m, but also represents an important step towards rationalising this supply chain’s future operations by compiling a body of data that was previously unavailable for research and analysis.
Stochastic optimisation and game theory on energy markets
In this talk we develop a stochastic optimisation-game theory model of an energy market that includes the transmission network and a small number of agents with oligopolistic behaviour. We consider a general network with nonlinear externalities and nonlinear pricing rules. As a tool for the modelling and analysis we also use mechanism design theory.
Case study: South African health worker allocation
We present a problem in healthcare for under-served regions. Without healthcare workers, no healthcare can be provided. In regions with fewer workers available than would be ideal, the question of where to place the workers becomes essential, both to ensure the health of the population and to achieve equity in the provision of healthcare between regions.
Based on a population in South Africa, we take a statistical regression model and use it to optimise worker allocation on the basis of expected number of healthcare visits. This problem can be approached from a MIP and/or a CP perspective and we will use it as a case study for program participants. We will provide data and allow the participants to discuss and form their own opinions on what the most appropriate objectives and constraints are for this problem.
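Before turning to MIP or CP formulations, participants may find it useful to see how far a greedy heuristic gets. The sketch below (with hypothetical region names, visit rates, and a made-up diminishing-returns model) allocates workers by repeatedly taking the best marginal gain, which is optimal when returns are concave:

```python
import heapq

def allocate(workers, regions):
    """Greedy allocation maximising total expected visits.

    Assumed (hypothetical) model: the (n+1)-th worker placed in region r
    yields regions[r] / (n + 1) additional expected visits. Marginal
    gains are decreasing, so picking the largest remaining gain each
    time maximises the total.
    """
    counts = {r: 0 for r in regions}
    heap = [(-base, r) for r, base in regions.items()]  # max-heap via negation
    heapq.heapify(heap)
    for _ in range(workers):
        _, r = heapq.heappop(heap)
        counts[r] += 1
        # the next worker in r would yield regions[r] / (counts[r] + 1)
        heapq.heappush(heap, (-regions[r] / (counts[r] + 1), r))
    return counts

# Hypothetical expected visits delivered by a first worker in each region
regions = {"rural_A": 120.0, "rural_B": 60.0, "periurban": 90.0}
counts = allocate(5, regions)
```

Equity constraints (e.g. a minimum allocation per region) are exactly what break this greedy picture and motivate the MIP/CP discussion in the case study.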
Some recent advances in polynomial optimisation
Optimisation problems involving polynomial functions are of great importance in applied mathematics and engineering, and they are intrinsically hard problems. They arise in important engineering applications such as the sensor network localisation problem, and provide a rich and fruitful interaction between algebraic-geometric concepts and modern convex programming.
The talk will be divided into two parts. In the first part, I will describe the key results in this exciting area, highlighting the geometric and conceptual aspects as well as recent work on exact semi-definite program relaxations for polynomial optimisation problems. In the second part, I will explain how the semi-algebraic structure helps us to analyse the explicit convergence rate of some important and powerful algorithms, such as the alternating projection, proximal point and Douglas-Rachford algorithms. Applications to tensor computations and sparse optimisation problems will be discussed, if time permits.
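For readers unfamiliar with the alternating projection algorithm, here is a minimal two-dimensional sketch (the sets and starting point are arbitrary illustrative choices): projecting back and forth between two intersecting convex sets converges to a point in their intersection.

```python
import math

def proj_line(p):
    """Project a point onto the x-axis {(x, 0)}."""
    return (p[0], 0.0)

def proj_disk(p, centre=(0.0, 0.5), r=1.0):
    """Project a point onto the closed disk of radius r about centre."""
    dx, dy = p[0] - centre[0], p[1] - centre[1]
    d = math.hypot(dx, dy)
    if d <= r:
        return p                        # already inside: no move needed
    return (centre[0] + r * dx / d, centre[1] + r * dy / d)

# Alternate the two projections; the iterates converge to a point in
# the intersection of the x-axis and the disk.
p = (3.0, 2.0)
for _ in range(200):
    p = proj_disk(proj_line(p))
x, y = p
```

For convex sets the convergence is linear, with a rate governed by the angle at which the sets meet; quantifying such rates for semi-algebraic sets is exactly the kind of analysis the second part of the talk addresses.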
Optimisation in data analysis: survey and recent developments
Optimisation methodology has proved to be essential to formulating and solving problems in data analysis, machine learning, and computational statistics. Such problems are characterised by fairly elementary objective functions but a very large amount of data. Algorithms need to take account of the statistical/learning context, the expense of computing function and derivative information, non-smoothness, and (increasingly) non-convexity. The talk will sketch canonical problem formulations, fundamental algorithmic techniques, and issues of current research focus.
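The canonical example of "elementary objective, lots of data" is least squares trained by stochastic gradient descent, sketched below on synthetic data (the model, learning rate and data are illustrative choices, not from the talk):

```python
import random

def sgd_least_squares(data, lr=0.05, epochs=100, seed=0):
    """Stochastic gradient descent for 1-D least squares y ~ w*x + b.

    Each update uses a single (x, y) sample, so the per-step cost is
    independent of the dataset size - the point of SGD at scale.
    """
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)               # visit samples in random order
        for x, y in data:
            err = (w * x + b) - y       # gradient of 0.5*err^2 w.r.t. (w, b)
            w -= lr * err * x
            b -= lr * err
    return w, b

# Synthetic data from y = 2x + 1 with small Gaussian noise
rng = random.Random(42)
data = [(x / 10, 2 * (x / 10) + 1 + rng.gauss(0, 0.01))
        for x in range(-20, 21)]
w, b = sgd_least_squares(data)
```

The recovered slope and intercept sit close to the generating values 2 and 1; the research questions mentioned above arise when the objective is non-smooth or non-convex and such clean convergence is no longer automatic.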