The scheduling of gasoline-blending operations is an important problem in the oil refining industry. This problem not only exhibits the combinatorial nature that is intrinsic to scheduling problems, but also non-convex nonlinear behavior, due to the blending of various materials with different quality properties. In this work, a global optimization algorithm is proposed to solve a previously published continuous-time mixed-integer nonlinear scheduling model for gasoline blending. The model includes blend recipe optimization, the distribution problem, and several important operational features and constraints. The algorithm employs piecewise McCormick relaxation (PMCR) and the normalized multiparametric disaggregation technique (NMDT) to compute estimates of the global optimum. These techniques partition the domain of one of the variables in a bilinear term and generate convex relaxations for each partition. By increasing the number of partitions and reducing the domain of the variables, the algorithm is able to refine the estimates of the global solution. The algorithm is compared to two commercial global solvers and two heuristic methods by solving four examples from the literature. Results show that the proposed global optimization algorithm performs on par with commercial solvers but is not as fast as heuristic approaches.
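For readers unfamiliar with these relaxations, the standard McCormick envelope of a bilinear term w = xy, and its piecewise refinement over a partitioned x-domain, can be sketched as follows. This is a minimal illustration under assumed uniform partitioning; the function names are hypothetical and do not reproduce the paper's model:

```python
def mccormick_bounds(x, y, xL, xU, yL, yU):
    """Convex (lower) and concave (upper) McCormick envelopes of w = x*y
    evaluated at (x, y) on the box [xL, xU] x [yL, yU]."""
    lower = max(xL * y + x * yL - xL * yL,
                xU * y + x * yU - xU * yU)
    upper = min(xU * y + x * yL - xU * yL,
                xL * y + x * yU - xL * yU)
    return lower, upper

def piecewise_mccormick_bounds(x, y, xL, xU, yL, yU, n):
    """Piecewise variant: partition the x-domain into n uniform intervals
    and apply the envelope on the interval containing x. The relaxation
    tightens as n grows, which is how the algorithm refines its estimates."""
    width = (xU - xL) / n
    k = min(int((x - xL) / width), n - 1)  # index of the partition holding x
    return mccormick_bounds(x, y, xL + k * width, xL + (k + 1) * width, yL, yU)
```

At x = y = 0.5 on the unit box, the single-partition envelope brackets the true product 0.25 loosely, while two partitions already close the gap at that point, illustrating why refinement drives the bound toward the global optimum.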
As the demand for energy continues to increase, shale gas, as an unconventional source of methane (CH4), shows great potential for commercialization. However, due to the ultra-low permeability of shale gas reservoirs, special procedures such as horizontal drilling, hydraulic fracturing, periodic well shut-in, and carbon dioxide (CO2) injection may be required in order to boost gas production, maximize economic benefits, and ensure safe and environmentally sound operation. Although intensive research is devoted to this emerging technology, many researchers have studied shale gas design and operational decisions only in isolation. In fact, these decisions are highly interactive and should be considered simultaneously. Therefore, the research question addressed in this study concerns the interactions between design and operational decisions. In this paper, we first establish a full-physics model for a shale gas reservoir. Next, we conduct a sensitivity analysis of important design and operational decisions such as well length, well arrangement, number of fractures, fracture distance, CO2 injection rate, and shut-in scheduling in order to gain in-depth insights into the complex behavior of shale gas networks. The results suggest that the case with the highest shale gas production may not necessarily be the most profitable design; and that drilling, fracturing, and CO2 injection have great impacts on the economic viability of this technology. In particular, due to the high costs, enhanced gas recovery (EGR) using CO2 does not appear to be commercially competitive, unless tax abatements or subsidies are available for CO2 sequestration. The results also show that the interactions between design and operational decisions are significant and that these decisions should be optimized simultaneously.
In this paper, a reinforcement learning (RL)-based Sarsa temporal-difference (TD) algorithm is applied to search for a unified bidding and operation strategy for a coal-fired power plant with monoethanolamine (MEA)-based post-combustion carbon capture under different carbon dioxide (CO2) allowance market conditions. The objective of the decision maker for the power plant is to maximize the discounted cumulative profit during the power plant lifetime. Two constraints are considered for the objective formulation. Firstly, the tradeoff between the energy-intensive carbon capture and the electricity generation should be made under presumed fixed fuel consumption. Secondly, the CO2 allowances purchased from the CO2 allowance market should be approximately equal to the quantity of CO2 emission from power generation. Three case studies are then presented. In the first case, we show the convergence of the Sarsa TD algorithm and find a deterministic optimal bidding and operation strategy. In the second case, compared with the independently designed operation and bidding strategies discussed in most of the relevant literature, the Sarsa TD-based unified bidding and operation strategy with time-varying flexible market-oriented CO2 capture levels is demonstrated to help the power plant decision maker gain a higher discounted cumulative profit. In the third case, a competitor operating another power plant identical to the preceding plant is considered under the same CO2 allowance market. The competitor also has carbon capture facilities but applies a different strategy to earn profits. The discounted cumulative profits of the two power plants are then compared, thus exhibiting the competitiveness of the power plant that is using the unified bidding and operation strategy explored by the Sarsa TD algorithm.
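The core of the Sarsa TD algorithm is its one-step update rule, Q(s,a) ← Q(s,a) + α[r + γQ(s′,a′) − Q(s,a)], applied along the trajectory generated by an ε-greedy policy. The sketch below is a generic tabular illustration, not the paper's plant-and-market model; the function names and hyperparameter values are assumptions:

```python
import random
from collections import defaultdict

def epsilon_greedy(Q, state, actions, eps):
    """With probability eps explore a random action; otherwise exploit
    the action with the highest current Q-value estimate."""
    if random.random() < eps:
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])

def sarsa_update(Q, s, a, r, s_next, a_next, alpha=0.1, gamma=0.95):
    """One-step Sarsa TD update:
    Q(s,a) <- Q(s,a) + alpha * [r + gamma * Q(s',a') - Q(s,a)],
    where a' is the action actually chosen in s' (on-policy)."""
    td_target = r + gamma * Q[(s_next, a_next)]
    Q[(s, a)] += alpha * (td_target - Q[(s, a)])
    return Q
```

Because the update bootstraps on the action actually taken next (on-policy), repeated sweeps converge toward the value of the policy being followed; annealing ε then recovers a deterministic strategy of the kind reported in the first case study.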
Smart manufacturing will transform the oil refining and petrochemical sector into a connected, information-driven environment. Using real-time and high-value support systems, smart manufacturing enables a coordinated and performance-oriented manufacturing enterprise that responds quickly to customer demands and minimizes energy and material usage, while radically improving sustainability, productivity, innovation, and economic competitiveness. In this paper, several examples of the application of so-called "smart manufacturing" for the petrochemical sector are demonstrated, including the big data-driven fault detection of a catalytic cracking unit and advanced optimization for the planning and scheduling of oil refinery sites. Key scientific factors and challenges for the further development of smart manufacturing in chemical and petrochemical processes are identified.
In the globalized market environment, increasingly significant economic and environmental factors within complex industrial plants make the optimization of global production indices increasingly important; such optimization includes improvements in production efficiency, product quality, and yield, along with reductions in energy and resource usage. This paper briefly overviews recent progress in data-driven hybrid intelligence optimization methods and technologies for improving the performance of global production indices in mineral processing. First, we describe the problem. Next, we summarize recent progress in data-based optimization for mineral processing plants. This optimization consists of four layers: optimization of the target values for monthly global production indices, optimization of the target values for daily global production indices, optimization of the target values for operational indices, and automation systems for unit processes. We briefly overview recent progress in each of these layers. Finally, we point out opportunities for future work in data-based optimization for mineral processing plants.
This work uses a mathematical optimization approach to analyze and compare facilities that either capture carbon dioxide (CO2) artificially or use naturally captured CO2 in the form of lignocellulosic biomass toward the production of the same product, dimethyl ether (DME). In nature, plants capture CO2 via photosynthesis in order to grow. The design of the first process discussed here is based on a superstructure optimization approach in order to select technologies that transform lignocellulosic biomass into DME. Biomass is gasified; next, the raw syngas must be purified using reforming, scrubbing, and carbon capture technologies before it can be used to directly produce DME. Alternatively, CO2 can be captured and used to produce DME via hydrogenation. Hydrogen (H2) is produced by splitting water using solar energy. Facilities based on either photovoltaic (PV) or concentrated solar power (CSP) technologies have been designed; their monthly operation, which is based on solar availability, is determined using a multi-period approach. The current level of technological development gives biomass an advantage as a carbon capture technology, since both water consumption and economic parameters are in its favor. However, due to the area required for growing biomass and the total amount of water consumed (if plant growing is also accounted for), the decision to use biomass is not a straightforward one.
Most olefins (e.g., ethylene and propylene) will continue to be produced through steam cracking (SC) of hydrocarbons in the coming decade. In an uncertain commodity market, the chemical industry is investing very little in alternative technologies and feedstocks because of their current lack of economic viability, despite decreasing crude oil reserves and the recognition of global warming. In this perspective, some of the most promising alternatives are compared with the conventional SC process, and the major bottlenecks of each of the competing processes are highlighted. These technologies emerge especially from the abundance of cheap propane, ethane, and methane from shale gas and stranded gas. From an economic point of view, methane is an interesting starting material, if chemicals can be produced from it. The huge availability of crude oil and the expected substantial decline in the demand for fuels imply that the future for proven technologies such as Fischer-Tropsch synthesis (FTS) or methanol to gasoline is not bright. The abundance of cheap ethane and the large availability of crude oil, on the other hand, have caused the SC industry to shift to these two extremes, making room for the on-purpose production of light olefins, such as by the catalytic dehydrogenation of propane.
Over time, the performance of processes may deviate from the initial design due to process variations and uncertainties, making it necessary to develop systematic methods for online optimality assessment based on routine operating process data. Some processes have multiple operating modes caused by set point changes of the critical process variables to achieve different product specifications. On the other hand, the operating region in each operating mode can shift due to uncertainties. In this paper, we establish an optimality assessment framework for processes that typically have multi-mode, multi-region operations, as well as transitions between different modes. A kernel density approach is adopted and improved for operating mode detection. For online mode detection, the model-based clustering discriminant analysis (MclustDA) approach is incorporated with some a priori knowledge of the system. In addition, the multi-modal behavior of steady-state modes is tackled using the mixture probabilistic principal component regression (MPPCR) method, and dynamic principal component regression (DPCR) is used to investigate transitions between different modes. Moreover, a probabilistic causality detection method based on the sequential forward floating search (SFFS) method is introduced for diagnosing poor or non-optimum behavior. Finally, the proposed method is tested on the Tennessee Eastman (TE) benchmark simulation process in order to evaluate its performance.
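The idea behind kernel density-based mode detection is that each operating mode appears as a local maximum of the estimated density of a key process variable. The following one-dimensional sketch illustrates this on synthetic data; it is a simplified assumption-laden illustration (Gaussian kernel, fixed bandwidth, grid search for maxima), not the improved method developed in the paper:

```python
import math

def gaussian_kde(data, bandwidth):
    """Return a 1-D Gaussian kernel density estimator over the samples."""
    def density(x):
        n = len(data)
        return sum(math.exp(-0.5 * ((x - d) / bandwidth) ** 2)
                   for d in data) / (n * bandwidth * math.sqrt(2 * math.pi))
    return density

def detect_modes(data, bandwidth=0.5, grid_points=200):
    """Locate candidate operating modes as local maxima of the kernel
    density estimate, scanned on a uniform grid spanning the data."""
    f = gaussian_kde(data, bandwidth)
    lo = min(data) - 3 * bandwidth
    hi = max(data) + 3 * bandwidth
    xs = [lo + i * (hi - lo) / (grid_points - 1) for i in range(grid_points)]
    ys = [f(x) for x in xs]
    return [xs[i] for i in range(1, grid_points - 1)
            if ys[i] > ys[i - 1] and ys[i] > ys[i + 1]]
```

Applied to historical data with two clusters of set points, the estimator yields two density peaks, one per operating mode; online samples can then be assigned to the nearest detected mode before region-level optimality is assessed.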
The Paris Agreement proposed to keep the increase in global average temperature to well below 2 °C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5 °C above pre-industrial levels. It was thus the first international treaty to endow the 2 °C global temperature target with legal effect. The qualitative expression of the ultimate objective in Article 2 of the United Nations Framework Convention on Climate Change (UNFCCC) has now evolved into the numerical temperature rise target in Article 2 of the Paris Agreement. Starting with the Second Assessment Report (SAR) of the Intergovernmental Panel on Climate Change (IPCC), an important task for subsequent assessments has been to provide scientific information to help determine the quantified long-term goal for UNFCCC negotiation. However, because judging what extent of global temperature rise is unacceptable involves value judgments beyond the scope of scientific assessment, the IPCC has never scientifically affirmed such a threshold. The setting of the long-term goal for addressing climate change has been a long process, and the 2 °C global temperature target is a political consensus built on the basis of scientific assessment. This article analyzes the evolution of the long-term global goal for addressing climate change and its impact on scientific assessment, negotiation processes, and global low-carbon development, from the aspects of the origin of the target, the series of assessments carried out by the IPCC focusing on Article 2 of the UNFCCC, and the promotion of the global temperature goal at the political level.
The challenges posed by smart manufacturing for the process industries and for process systems engineering (PSE) researchers are discussed in this article. Much progress has been made in achieving plant- and site-wide optimization, but benchmarking would give greater confidence. Technical challenges confronting process systems engineers in developing enabling tools and techniques are discussed regarding flexibility and uncertainty, responsiveness and agility, robustness and security, the prediction of mixture properties and function, and new modeling and mathematics paradigms. Exploiting intelligence from big data to drive agility will require tackling new challenges, such as how to ensure the consistency and confidentiality of data through long and complex supply chains. Modeling challenges also exist, and involve ensuring that all key aspects are properly modeled, particularly where health, safety, and environmental concerns require accurate predictions of small but critical amounts at specific locations. Environmental concerns will require us to keep a closer track on all molecular species so that they are optimally used to create sustainable solutions. Disruptive business models may result, particularly from new personalized products, but that is difficult to predict.