With the popularization of the Internet, permeation of sensor networks, emergence of big data, increase in size of the information community, and interlinking and fusion of data and information throughout human society, physical space, and cyberspace, the information environment related to the current development of artificial intelligence (AI) has profoundly changed. The goals of AI thus face important adjustments, and its scientific foundations are confronted with new breakthroughs, as AI enters a new stage: AI 2.0. This paper briefly reviews the 60-year developmental history of AI, analyzes the external environment promoting the formation of AI 2.0 along with the changes in its goals, and describes both the emergence of the technology and the core ideas behind AI 2.0 development. Furthermore, based on the combined social demands and information environment of China's development, suggestions on the development of AI 2.0 are given.
Intelligent manufacturing is a general concept that is under continuous development. It can be categorized into three basic paradigms: digital manufacturing, digital-networked manufacturing, and new-generation intelligent manufacturing. New-generation intelligent manufacturing represents an in-depth integration of new-generation artificial intelligence (AI) technology and advanced manufacturing technology. It runs through every link in the full life-cycle of design, production, product, and service. The concept also relates to the optimization and integration of corresponding systems; the continuous improvement of enterprises' product quality, performance, and service levels; and the reduction of resource consumption. New-generation intelligent manufacturing acts as the core driving force of the new industrial revolution and will continue to be the main pathway for the transformation and upgrading of the manufacturing industry in the decades to come. Human-cyber-physical systems (HCPSs) reveal the technological mechanisms of new-generation intelligent manufacturing and can effectively guide related theoretical research and engineering practice. Given the sequential development, cross interaction, and iterative upgrading characteristics of the three basic paradigms of intelligent manufacturing, a technology roadmap of "parallel promotion and integrated development" should be developed in order to drive forward the intelligent transformation of the manufacturing industry in China.
This paper summarizes the development of hydro-projects in China, blended with an international perspective. It expounds major technical progress toward ensuring the safe construction of high dams and river harnessing, and covers the theorization of uneven non-equilibrium sediment transport, inter-basin water diversion, giant hydro-generator units, pumped storage power stations, underground caverns, ecological protection, and so on.
This paper presents findings from an investigation of the large-scale construction solid waste (CSW) landslide that occurred at a landfill in Shenzhen, Guangdong, China, on December 20, 2015, which killed 77 people and destroyed 33 houses. The landslide involved 2.73×106 m3 of CSW and affected an area about 1100 m in length and 630 m in maximum width, making it the largest landfill landslide in the world. The investigation of this disaster used a combination of unmanned aerial vehicle surveillance and multistage remote-sensing images to reveal the increasing volume of waste in the landfill and the shifting shape of the landfill slope for nearly two years before the landslide took place, beginning with the creation of the CSW landfill in March 2014. This history resulted in uncertain landfill boundary conditions and an unstable hydrologic state, so applying conventional stability analysis methods used for natural landslides to this case would be difficult. In order to analyze this disaster, we adopted a multistage modeling technique to analyze the varied structural characteristics of the landfill slope at various stages of CSW dumping, and used non-steady flow theory to explain the groundwater seepage problem. The investigation showed that the landfill could be divided into two units based on moisture content: ① a front unit, consisting of the landfill slope, which had a low water content; and ② a rear unit, consisting of fresh waste, which had a high water content. This structure caused two effects, surface-water infiltration and consolidation seepage, which together triggered the landslide in the landfill. Surface-water infiltration induced a gradual increase in the pore water pressure head, or piezometric head, in the front slope, because the infiltrating position rose as the volume of waste placement increased. Consolidation seepage led to higher excess pore water pressures as the loading of waste increased.
We also investigated the post-failure soil dynamics parameters of the landslide deposit using cone penetration, triaxial, and ring-shear tests in order to simulate the characteristics of a flowing slide with a long run-out due to the liquefaction effect. Finally, we conclude the paper with lessons drawn from dozens of catastrophic municipal solid waste landslides around the world and discuss how to better manage the geotechnical risks of urbanization.
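The role of rising pore water pressure described above can be illustrated with a simple infinite-slope limit-equilibrium calculation, a textbook sketch rather than the multistage model used in the investigation; all parameter values below are hypothetical.

```python
import math

def factor_of_safety(c_eff, phi_eff_deg, gamma, depth, beta_deg, pore_pressure):
    """Infinite-slope factor of safety against sliding.

    c_eff:         effective cohesion on the slip surface (kPa)
    phi_eff_deg:   effective friction angle (degrees)
    gamma:         unit weight of the waste (kN/m^3)
    depth:         vertical depth to the slip surface (m)
    beta_deg:      slope angle (degrees)
    pore_pressure: pore water pressure on the slip surface (kPa)
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_eff_deg)
    # Total normal and shear stresses on a plane parallel to the slope
    normal_stress = gamma * depth * math.cos(beta) ** 2
    shear_stress = gamma * depth * math.sin(beta) * math.cos(beta)
    # Effective-stress shear resistance (Mohr-Coulomb)
    resisting = c_eff + (normal_stress - pore_pressure) * math.tan(phi)
    return resisting / shear_stress

# Illustrative values: the same slope, dry versus with elevated pore pressure
fs_dry = factor_of_safety(5.0, 25.0, 16.0, 10.0, 20.0, pore_pressure=0.0)
fs_wet = factor_of_safety(5.0, 25.0, 16.0, 10.0, 20.0, pore_pressure=60.0)
```

With these assumed numbers, raising the pore pressure alone drives the factor of safety from above 1 (stable) to below 1 (failure), mirroring the mechanism by which infiltration and consolidation seepage destabilized the front slope.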
The rise of big data has led to new demands for machine learning (ML) systems to learn complex models, with millions to billions of parameters, that promise adequate capacity to digest massive datasets and offer powerful predictive analytics (such as high-dimensional latent features, intermediate representations, and decision functions) thereupon. In order to run ML algorithms at such scales, on a distributed cluster with tens to thousands of machines, it is often the case that significant engineering efforts are required—and one might fairly ask whether such engineering truly falls within the domain of ML research. Taking the view that “big” ML systems can benefit greatly from ML-rooted statistical and algorithmic insights—and that ML researchers should therefore not shy away from such systems design—we discuss a series of principles and strategies distilled from our recent efforts on industrial-scale ML solutions. These principles and strategies span a continuum from application, to engineering, and to theoretical research and development of big ML systems and architectures, with the goal of understanding how to make them efficient, generally applicable, and supported with convergence and scaling guarantees. They concern four key questions that traditionally receive little attention in ML research: How can an ML program be distributed over a cluster? How can ML computation be bridged with inter-machine communication? How can such communication be performed? What should be communicated between machines? 
By exposing underlying statistical and algorithmic characteristics unique to ML programs but not typically seen in traditional computer programs, and by dissecting successful cases to reveal how we have harnessed these principles to design and develop both high-performance distributed ML software and general-purpose ML frameworks, we present opportunities for ML researchers and practitioners to further shape and enlarge the area that lies between ML and systems.
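As a toy illustration of the first two questions (how an ML program is distributed over a cluster, and how computation is bridged with communication), the sketch below simulates synchronous data-parallel SGD: each "worker" computes a gradient on its own data shard, and a central step averages the gradients and updates the shared parameters. This is a minimal single-process stand-in for a parameter-server architecture, not the systems described in the paper.

```python
import numpy as np

def simulate_data_parallel_sgd(X, y, n_workers=4, lr=0.1, epochs=50):
    """Simulate synchronous data-parallel SGD for least-squares regression.

    Each worker holds one shard of (X, y) and computes a local gradient;
    the 'server' averages the worker gradients and applies one update.
    """
    w = np.zeros(X.shape[1])
    # Partition the data: one shard per simulated worker
    shards = np.array_split(np.arange(len(X)), n_workers)
    for _ in range(epochs):
        grads = []
        for idx in shards:
            Xi, yi = X[idx], y[idx]
            # Local least-squares gradient on this worker's shard
            grads.append(2.0 * Xi.T @ (Xi @ w - yi) / len(idx))
        # "Communication" step: average worker gradients, update shared model
        w -= lr * np.mean(grads, axis=0)
    return w
```

Real systems refine exactly the steps collapsed here, for example by overlapping the per-worker gradient computation with communication, or by relaxing the strict synchronization of the averaging step.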
Starting with the Ertan arch dam (240 m high, 3300 MW) in 2000, China successfully built a total of seven ultra-high arch dams over 200 m tall by the end of 2014. Among these, the Jinping I (305 m), Xiaowan (294.5 m), and Xiluodu (285.5 m) arch dams have reached the 300 m height level (i.e., near or over 300 m), making them the tallest arch dams in the world. The design and construction of these 300 m ultra-high arch dams posed significant challenges, due to high water pressures, high seismic design criteria, and complex geological conditions. The engineering team successfully tackled these challenges and made critical breakthroughs, especially in the area of safety control. In this paper, the author summarizes various key technological aspects involved in the design and construction of 300 m ultra-high arch dams, including the strength and stability of foundation rock, excavation of the dam base and surface treatment, dam shape optimization, safety design guidelines, seismic analysis and design, treatment of a complex foundation, concrete temperature control, and crack prevention. The experience gained from these projects should be valuable for future practitioners.
This study provides a definition for urban big data while exploring its features and its applications in China's city intelligence. The differences between city intelligence in China and the "smart city" concept in other countries are compared to highlight and contrast the unique definition and model of China's city intelligence presented in this paper. Furthermore, this paper examines the role of urban big data in city intelligence by showing that it not only serves as the cornerstone of this trend but also plays a core role in the diffusion of city intelligence technology, and serves as an inexhaustible resource for the sustained development of city intelligence. This study also points out the challenges of shaping and developing China's urban big data. Considering the supporting and core role that urban big data plays in city intelligence, the study then expounds on the key points of urban big data, including infrastructure support, urban governance, public services, and economic and industrial development. Finally, this study points out the utility of city intelligence as an ideal policy tool for advancing the goals of China's urban development. In conclusion, it is imperative that China make full use of its unique advantages in subjective and objective conditions (including the nation's current state of development and resources, geographical advantages, and good human relations) to promote the development of city intelligence through the proper application of urban big data.
In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.
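A common ingredient of such analyses is Monte Carlo estimation of a failure-mode probability. The sketch below estimates the sliding failure probability of a highly simplified gravity dam section by sampling uncertain strength parameters; the loads and parameter distributions are illustrative assumptions, not values from the benchmark workshops.

```python
import numpy as np

def sliding_failure_probability(n_samples=100_000, seed=0):
    """Monte Carlo estimate of P(sliding failure) for a simplified
    gravity dam section (per unit length of dam; all values illustrative).

    Factor of safety:  FS = (c*A + (W - U) * tan(phi)) / H
      W: dam weight, U: uplift force, H: horizontal water thrust (kN)
      A: area of the sliding plane (m^2)
      c, tan(phi): uncertain shear-strength parameters of the interface
    """
    rng = np.random.default_rng(seed)
    W, U, H, A = 50_000.0, 12_000.0, 20_000.0, 60.0  # assumed deterministic
    # Assumed distributions for the uncertain strength parameters
    c = rng.lognormal(mean=np.log(100.0), sigma=0.4, size=n_samples)    # kPa
    tan_phi = rng.normal(loc=0.7, scale=0.12, size=n_samples)
    fs = (c * A + (W - U) * tan_phi) / H
    # Failure whenever the sampled factor of safety drops below 1
    return np.mean(fs < 1.0)
```

With these assumptions the mean factor of safety is well above 1, yet the tails of the strength distributions still yield a small but nonzero failure probability, which is precisely the kind of quantity a deterministic safety-factor check cannot provide.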
Additive manufacturing (AM) permits the fabrication of functionally optimized components with high geometrical complexity. The opportunity of using porous infill as an integrated part of the manufacturing process is an example of a unique AM feature. Automated design methods are still incapable of fully exploiting this design freedom. In this work, we show how the so-called coating approach to topology optimization provides a means for designing infill-based components that possess a strongly improved buckling load and, as a result, improved structural stability. The suggested approach thereby addresses an important inadequacy of the standard minimum compliance topology optimization approach, in which buckling is rarely accounted for; rather, a satisfactory buckling load is usually assured through a post-processing step that may lead to sub-optimal components. The present work compares the standard and coating approaches to topology optimization for the MBB beam benchmark case. The optimized structures are additively manufactured using a filamentary technique. This experimental study validates the numerical model used in the coating approach. Depending on the properties of the infill material, the buckling load may be more than four times higher than that of solid structures optimized under the same conditions.
To date, the Three Gorges Project is the largest hydro junction in the world. It is the key project for the integrated water resource management and development of the Changjiang River. The technology of the project, with its huge scale and comprehensive benefits, is extremely complicated, and the design difficulty is greater than that of any other hydro project in the world. A series of new design theories and methods have been proposed and applied in the design and research process. Many key technological problems regarding hydraulic structures have been overcome, such as a gravity dam with multi-layer large discharge orifices, a hydropower station with giant generating units, and a giant continuous multi-step ship lock with a high water head.
Method development has always been and will continue to be a core driving force of microbiome science. In this perspective, we argue that in the next decade, method development in microbiome analysis will be driven by three key changes in both ways of thinking and technological platforms: ① a shift from dissecting microbiota structure by sequencing to tracking microbiota state, function, and intercellular interaction via imaging; ② a shift from interrogating a consortium or population of cells to probing individual cells; and ③ a shift from microbiome data analysis to microbiome data science. Some of the recent method-development efforts by Chinese microbiome scientists and their international collaborators that underlie these technological trends are highlighted here. It is our belief that the China Microbiome Initiative has the opportunity to deliver outstanding "Made-in-China" tools to the international research community, by building an ambitious, competitive, and collaborative program at the forefront of method development for microbiome science.
Bionics (the imitation or abstraction of the "inventions of nature") and, to an even greater extent, synthetic biology, will be as relevant to engineering development and industry as the silicon chip was over the last 50 years. Chemical industries already use so-called "white biotechnology" for new processes, new raw materials, and more sustainable use of resources. Synthetic biology is also used for the development of second-generation biofuels and for harvesting the sun's energy with the help of tailor-made microorganisms or biomimetically designed catalysts. The market potential for bionics in medicine, engineering processes, and DNA storage is huge. "Moonshot" projects are already aggressively focusing on diseases and new materials, and a US-led competition is currently underway with the aim of creating a thousand new molecules. This article describes a timeline that starts with current projects and then moves on to code engineering projects and their implications, artificial DNA, signaling molecules, and biological circuitry. Beyond these projects, one of the next frontiers in bionics is the design of synthetic metabolisms that include artificial food chains and foods, and the bioengineering of raw materials; all of which will lead to new insights into biological principles. Bioengineering will be an innovation motor just as digitalization is today. This article discusses pertinent examples of bioengineering, particularly the use of alternative carbon-based biofuels and the techniques and perils of cell modification. Big data, analytics, and massive storage are important factors in this next frontier. Although synthetic biology will be as pervasive and transformative in the next 50 years as digitization and the Internet are today, its applications and impacts are still in nascent stages.
This article provides a general taxonomy in which the development of bioengineering is classified in five stages (DNA analysis, bio-circuits, minimal genomes, protocells, xenobiology) from the familiar to the unknown, with implications for safety and security, industrial development, and the development of bioengineering and biotechnology as an interdisciplinary field. Ethical issues and the importance of a public debate about the consequences of bionics and synthetic biology are discussed.
Tissue engineering is a relatively new but rapidly developing field in the medical sciences. Noncoding RNAs (ncRNAs) are functional RNA molecules without a protein-coding function; they can regulate cellular behavior and change the biological milieu of the tissue. The application of ncRNAs in tissue engineering is starting to attract increasing attention as a means of resolving a large number of unmet healthcare needs, although ncRNA-based approaches have not yet entered clinical practice. In-depth research on the regulation and delivery of ncRNAs may improve their application in tissue engineering. The aim of this review is: to outline essential ncRNAs that are related to tissue engineering for the repair and regeneration of nerve, skin, liver, vascular system, and muscle tissue; to discuss their regulation and delivery; and to anticipate their potential therapeutic applications.
The first author proposed the concept of the cemented material dam (CMD) in 2009. This concept was aimed at building an environmentally friendly dam in a safer and more economical way for both the dam and the area downstream. The concept covers the cemented sand, gravel, and rock dam (CSGRD), the rockfill concrete (RFC) dam (or the cemented rockfill dam, CRD), and the cemented soil dam (CSD). This paper summarizes the concept and principles of the CMD based on studies and practices in projects around the world. It also introduces new developments in the CSGRD, CRD, and CSD.
Municipal wastewater treatment has long been known as a high-cost and energy-intensive process that destroys most of the energy-containing molecules by spending energy and that leaves little energy and few nutrients available for reuse. Over the past few years, some wastewater treatment plants have tried to revamp themselves as "resource factories," enabled by new technologies and the upgrading of old technologies. In particular, there is a renewed interest in anaerobic biotechnologies, which can convert organic matter into usable energy and preserve nutrients for potential reuse. However, considerable technological and economic limitations still exist. Here, we provide an overview of recent advances in several cutting-edge anaerobic biotechnologies for wastewater treatment, including enhanced side-stream anaerobic sludge digestion, anaerobic membrane bioreactors, and microbial electrochemical systems, and discuss future challenges and opportunities for their applications. This review is intended to provide useful information to guide the future design and optimization of municipal wastewater treatment processes.
Given the significant requirements for transforming and promoting the process industry, we present the major limitations of current petrochemical enterprises, including limitations in decision-making, production operation, efficiency and security, information integration, and so forth. To promote a vision of the process industry with efficient, green, and smart production, modern information technology should be utilized throughout the entire optimization process for production, management, and marketing. To focus on smart equipment in manufacturing processes, as well as on the adaptive intelligent optimization of the manufacturing process, operating mode, and supply chain management, we put forward several key scientific problems in engineering in a demand-driven and application-oriented manner, namely: ① intelligent sensing and integration of all process information, including production and management information; ② collaborative decision-making in the supply chain, industry chain, and value chain, driven by knowledge; ③ cooperative control and optimization of plant-wide production processes via human-cyber-physical interaction; and ④ life-cycle assessments for safety and environmental footprint monitoring, in addition to tracing analysis and risk control. To address these limitations and core scientific problems, we further present fundamental theories and key technologies for smart and optimal manufacturing in the process industry. Although this paper discusses the process industry in China, the conclusions can be extended to the process industry around the world.
Hydropower is a clean, renewable, and environmentally friendly source of energy. It produced 3930 TW·h of electricity in 2015, accounting for 16% of the world's generated electricity and about 78% of renewable electricity generation. Hydropower and climate change show a double relationship. On the one hand, as an important renewable energy resource, hydropower contributes significantly to the avoidance of greenhouse gas (GHG) emissions and to the mitigation of global warming. On the other hand, climate change is likely to alter river discharge, impacting water availability and hydropower generation. Hydropower contributes significantly to the reduction of GHG emissions and to energy supply security. Compared with conventional coal power plants, hydropower prevents the emission of about 3 Gt of CO2 per year, which represents about 9% of global annual CO2 emissions. Hydropower projects may also have an enabling role beyond the electricity sector, as a financing instrument for multipurpose reservoirs and as an adaptive measure regarding the impacts of climate change on water resources, because regulated basins with large reservoir capacities are more resilient to water resource changes, less vulnerable to climate change, and act as a storage buffer against climate change. At the global level, the overall impact of climate change on existing hydropower generation may be expected to be small, or even slightly positive. However, there is the possibility of substantial variations across regions and even within countries. In conclusion, the general verdict on hydropower is that it is a cheap and mature technology that contributes significantly to climate change mitigation, and could play an important role in the climate change adaptation of water resource availability. However, careful attention is necessary to mitigate the substantial environmental and social costs. More than a terawatt of capacity could be added in the coming decades.
The aim of this article is to concisely describe the research projects that a selection of Italian universities is undertaking in the context of big data. Far from being exhaustive, this article has the objective of offering a sample of distinct applications that address the issue of managing huge amounts of data in Italy, collected in relation to diverse domains.
The emerging prototype for a Smart City is one of an urban environment with a new generation of innovative services for transportation, energy distribution, healthcare, environmental monitoring, business, commerce, emergency response, and social activities. Enabling the technology for such a setting requires a viewpoint of Smart Cities as cyber-physical systems (CPSs) that include new software platforms and strict requirements for mobility, security, safety, privacy, and the processing of massive amounts of information. This paper identifies some key defining characteristics of a Smart City, discusses some lessons learned from viewing them as CPSs, and outlines some fundamental research issues that remain largely open.
This article provides in-depth insights into the technologies necessary for automated driving in future cities. The state of the science is reflected from different perspectives, such as in-car computing and data management, roadside infrastructure, and cloud solutions. In particular, this article depicts the challenges of applying high-definition (HD) maps as a core technology for automated driving.