The North American Crude Oil Model: A Framework for Multimodal Transport

Transferring and transporting energy via railway, pipeline, or over rivers and seas entails risk. Safety hazards like train derailments and subsequent explosions, emissions, and spills can create environmental disasters. (See our previous post on North America's crude-by-rail problem.)

To tackle the challenges of producing, distributing, and consuming energy in all its forms, a variety of modeling approaches have been used. Many are beneficial, but all have limitations, as the probability of emissions and spills remains difficult to determine. To decrease the risk of harmful environmental impacts, new energy transportation models that consider multimodal transport are needed.

Through the North American Crude Oil Model (NACOM), we present a new modeling approach to energy network analysis that can help decision-makers respond more specifically to environmental and safety concerns. With the goal of supporting "what if" policy analysis under different market scenarios, we chose to use equilibrium problems expressed as complementarity problems. NACOM extends earlier modeling techniques while providing more detailed analyses of emissions and safety risks for each energy conveyance mode. It also incorporates data on cost and technology factors for each mode and can be applied by geographical region for finer distinctions. By applying this model to the North American crude oil market (with transportation via railway, pipeline, and waterway) and following it with a scenario analysis exploring avenues for reducing the public-safety and environmental impact of crude-by-rail transport, we arrive at a possible strategy for reducing risky rail transport in the near term.

NACOM can potentially be coupled with climate assessment models for further impact-based decisions and policy analyses, and its multimodal features can be incorporated into existing energy optimization and complementarity models. To date, no other model of North American crude with transfer-mode specificity exists, and NACOM's application at the US-state level of node disaggregation is a first in the academic literature.
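NACOM's full formulation is far richer, but the complementarity choice above can be illustrated with a toy spatial price equilibrium: two supply regions ship crude to one demand hub, and each region produces only if its delivered marginal cost equals the market price. The sketch below (all parameters illustrative, not NACOM data) encodes this as a convex welfare-maximization problem whose KKT conditions are exactly those complementarity conditions.

```python
# Toy spatial price equilibrium: two supply regions ship crude to one demand
# hub. The KKT conditions of this convex program are the complementarity
# conditions 0 <= s_i  perp  (c_i + d_i*s_i + t_i - p) >= 0: a region
# produces only if its delivered marginal cost equals the hub price.
# All parameters are illustrative, not NACOM data.
import cvxpy as cp

c = [20.0, 28.0]   # marginal production cost intercepts ($/bbl)
d = [0.10, 0.05]   # marginal production cost slopes
t = [4.0, 2.0]     # transport cost to the hub by each region's mode ($/bbl)
a, b = 80.0, 0.02  # inverse demand at the hub: p = a - b*q

s = cp.Variable(2, nonneg=True)   # shipments from each supply region
q = cp.sum(s)                     # total quantity reaching the hub

# Social welfare: consumer benefit minus production and transport costs.
welfare = (a * q - 0.5 * b * cp.square(q)
           - sum(c[i] * s[i] + 0.5 * d[i] * cp.square(s[i]) + t[i] * s[i]
                 for i in range(2)))
cp.Problem(cp.Maximize(welfare)).solve()

price = a - b * q.value
print("shipments:", s.value, "hub price:", price)
```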

Critical advances have been made in US energy policy, but crude oil will remain a major component of the energy mix for decades to come. NACOM shows that a combination of abolishing the export ban, investing in pipelines, and capping rail shipments yields the lowest crude-by-rail flows as well as the highest revenue. (All scenarios were similarly beneficial to the refining sector.) These outcomes suggest that integrated approaches are more likely to succeed in tackling the crude-by-rail problem and its attendant safety and environmental risks.

As Canada's investments in production and transportation capacity continue to grow, alongside a developing US-Mexico partnership, the need for accurate analyses of transport costs and effects will only increase. These trends indicate a continued need to find optimal intersections of policy and market decisions, not only for multimodal crude oil networks but also for current and future energy systems.

Inverse optimization is the study of inferring the unknown parameters of an optimization problem from observations of decisions previously made in that setting. We develop a framework to effectively and efficiently infer the cost vector of a linear optimization problem from multiple observations of past decisions.
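As a minimal sketch of the idea (not necessarily the paper's exact formulation), consider a forward problem of minimizing c'x subject to Ax >= b. By LP duality, an observation x_k is optimal for c exactly when some dual-feasible y closes the duality gap c'x_k - b'y. The sketch below, with illustrative data, imputes a cost vector by minimizing the summed gaps over all observations, using a simple normalization to rule out the trivial c = 0.

```python
# Inverse LP sketch: impute a cost vector c for the forward problem
#   min c'x  subject to  Ax >= b,
# given observed decisions x_1..x_K. Its dual is max b'y s.t. A'y = c,
# y >= 0, so we minimize the summed duality gaps c'x_k - b'y, with the
# normalization 1'y = 1 ruling out the trivial c = 0. Illustrative data.
import cvxpy as cp
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 1.0]])
b = np.array([4.0, 6.0])
X = np.array([[2.0, 1.1],   # one observed decision per row, each feasible
              [1.6, 1.3]])

m, n = A.shape
c = cp.Variable(n)
y = cp.Variable(m, nonneg=True)          # dual variables of the forward LP

gaps = [c @ x_k - b @ y for x_k in X]    # duality gap of each observation
prob = cp.Problem(cp.Minimize(sum(gaps)),
                  [A.T @ y == c,         # dual feasibility ties y to c
                   cp.sum(y) == 1])      # normalization, excludes c = 0
prob.solve()
print("imputed cost vector:", c.value)
```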

We then test our models on a diet problem, using a dataset obtained from NHANES. The dataset is accessible via the link below:

https://github.com/CSSEHealthcare/Dietary-Behavior-Dataset

A set of female individuals meeting the above criteria was considered. Further demographic and dietary considerations (to select similar patients) led us to use one day of intake from 11 different individuals as the initial dataset for the model. In another setting, we considered only people who had consumed a reasonable amount of sodium and water, as these two nutrients are the main constraints in the DASH diet.

To compare different potential data sources and their performance with the model, we used two groups from the NHANES database: a group of middle-aged women with similar characteristics, and a group of people with certain attributes in their diets. For the first group, we did not consider how each individual's daily diet reflected the constraints of the forward problem; instead, we relied on their self-reported answers to questions about hypertension and how prone they believed they were to type-2 diabetes. The result was a sparse set of variables and an inconclusive optimal solution with regard to preferences. For the second group, we sought sub-optimal data and prioritized the maximum sodium intake and the water intake constraints as the most important ones.
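For illustration, the cohort selection above might look like the following pandas sketch; the file name and column names (gender, age, sodium_mg, water_g) are hypothetical placeholders, and the thresholds are illustrative rather than the study's actual cutoffs.

```python
# Cohort selection sketch. The CSV name and the column names (gender, age,
# sodium_mg, water_g) are hypothetical placeholders; real NHANES field
# names differ, and the thresholds here are illustrative.
import pandas as pd

df = pd.read_csv("dietary_behavior.csv")

# Group 1: middle-aged women with similar demographic characteristics.
group1 = df[(df["gender"] == "female") & df["age"].between(45, 60)]

# Group 2: respondents whose intake respects the two DASH-motivated
# constraints treated as primary above: a sodium cap and a water minimum.
group2 = df[(df["sodium_mg"] <= 2300) & (df["water_g"] >= 1500)]

print(len(group1), "in demographic group;", len(group2), "in DASH-feasible group")
```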

We introduce a new approach that combines inverse optimization with conventional data analytics to recover the utility function of a human operator. In this approach, a set of the operator's final decisions is observed; for instance, the treatment plans a clinician chose for a patient, or the dietary choices a patient made to control their disease while also accounting for their personal preferences. Based on these observations, we develop a new framework that uses inverse optimization to infer how the operator prioritized different trade-offs to arrive at their decisions.

We develop a new inverse optimization framework to infer the constraint parameters of a linear (forward) optimization problem from multiple observations of the system. The goal is to find a feasible region for the forward problem such that all given observations become feasible and the preferred observations become optimal. We explore the theoretical properties of the model and develop computationally efficient equivalent models. We consider an array of functions to capture various desirable properties of the inferred feasible region. We apply our method to radiation therapy treatment planning, itself a complex optimization problem, to understand the clinical guidelines that oncologists use in practice. Recovering these guidelines (constraints) can standardize practice, increase planning efficiency and automation, and make high-quality personalized treatment plans for cancer patients possible.
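The general constraint-inference problem is harder than imputing costs; as a minimal sketch of one special case, suppose the constraint directions (the rows of A) are known and only the right-hand side b is inferred. Choosing each entry of b as the tightest bound that all observations satisfy keeps every observation feasible and makes the inferred region as small as possible, so extreme observations land on active constraints. The data below are illustrative, and the paper's framework is considerably more general.

```python
# Constraint-inference sketch for a region {x : Ax >= b} with the rows of A
# assumed known and only the right-hand side b inferred. Taking each b_i as
# the smallest value of a_i'x over the observations keeps all observations
# feasible while making the region as tight as possible, so extreme
# (preferred) observations sit on active constraints. Illustrative data.
import numpy as np

A = np.array([[1.0, 0.0],   # assumed/known constraint directions
              [0.0, 1.0],
              [1.0, 1.0]])
X = np.array([[2.0, 3.0],   # observed decisions, one per row
              [1.5, 4.0],
              [2.5, 2.5]])

b = (X @ A.T).min(axis=0)   # tightest RHS all observations still satisfy
print("inferred b:", b)     # every row x of X satisfies A @ x >= b
```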

Assume that a decision-maker's uncertain behavior is observed. We develop an inverse optimization framework to impute an objective function that is robust against misspecification of that behavior. In our model, instead of considering multiple data points, we consider an uncertainty set that encapsulates all possible realizations of the input data. We adopt this idea from robust optimization, which has been widely used for solving optimization problems with uncertain parameters. By bringing robust and inverse optimization together, we propose a robust inverse linear optimization model for uncertain input observations. We aim to find a cost vector for the underlying forward problem that minimizes the associated error under the worst-case realization of the uncertainty in the observed solutions. Such a cost vector is robust in the sense that it protects against the worst misspecification of the decision-maker's behavior.

As an example, we consider a diet recommendation problem. Suppose we want to learn the diet patterns and preferences of a specific person and make personalized recommendations in the future. The person’s choice, even if restricted by nutritional and budgetary constraints, may be inconsistent and vary over time. Assuming the person’s behavior can be represented by an uncertainty set, it is important to find a cost vector that renders the worst-case behavior within the uncertainty set as close to optimal as possible. Note that the cost vector can have a general meaning and may be interpreted differently depending on the application (e.g., monetary cost, utility function, or preferences). Under such a cost vector, any non-worst-case diet will thus have a smaller deviation from optimality.  
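A minimal sketch, assuming the uncertainty set is a polytope described by finitely many vertices: the worst-case duality gap is then attained at a vertex, so the robust problem reduces to an epigraph LP over those vertices. The data are illustrative and the normalization is one simple choice among several.

```python
# Robust inverse LP sketch: choose c to minimize the WORST-CASE duality gap
# over an uncertainty set of behaviors. If that set is a polytope with
# vertices x_1..x_K, guarding against the vertices suffices, giving an
# epigraph LP. Forward problem: min c'x s.t. Ax >= b. Illustrative data.
import cvxpy as cp
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 1.0]])
b = np.array([4.0, 6.0])
V = np.array([[2.0, 1.1],   # vertices of the uncertainty set of behaviors
              [1.6, 1.3],
              [2.2, 1.0]])

m, n = A.shape
c = cp.Variable(n)
y = cp.Variable(m, nonneg=True)   # dual variables: A'y = c, y >= 0
t = cp.Variable()                 # epigraph variable for the worst gap

cons = [A.T @ y == c, cp.sum(y) == 1]
cons += [c @ v - b @ y <= t for v in V]   # gap at every vertex at most t
cp.Problem(cp.Minimize(t), cons).solve()
print("robust cost vector:", c.value, "worst-case gap:", t.value)
```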

Radiation therapy is frequently used to treat patients with cancer. Currently, the planning of such treatments is typically done manually, which is time-consuming and prone to human error. Advances in computational power and treatment units now make it possible to design treatment plans automatically.

To design a high-quality treatment, we select the beam sizes, positions, and shapes using optimization models and approximation algorithms. The optimization models are designed to deliver an appropriate dose to the tumor volume while sparing sensitive healthy tissue. In this project, we find the best positions for the radiation focal points for Gamma Knife® Perfexion™ using quadratic programming and algorithms such as grassfire and sphere-packing.
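As a rough illustration of the grassfire and sphere-packing ingredients (on a 2-D toy target, whereas the clinical problem is 3-D with a fixed menu of shot sizes), one can repeatedly place a focal point at the voxel deepest inside the uncovered target and carve out the corresponding sphere:

```python
# Grassfire + sphere-packing sketch on a 2-D toy target (the clinical
# problem is 3-D with a fixed menu of shot sizes). Repeatedly place a shot
# at the voxel deepest inside the uncovered target, then carve out the
# covered ball and recompute the distance map.
import numpy as np
from scipy.ndimage import distance_transform_edt

yy, xx = np.mgrid[0:40, 0:40]
target = (xx - 20) ** 2 + (yy - 20) ** 2 <= 12 ** 2   # toy disk target

remaining, shots = target.copy(), []
while remaining.any():
    dist = distance_transform_edt(remaining)   # grassfire depth map
    r = dist.max()
    if r < 2.0:            # stop once only slivers thinner than the
        break              # smallest shot remain
    cy, cx = np.unravel_index(dist.argmax(), dist.shape)
    shots.append((cy, cx, r))
    remaining &= (xx - cx) ** 2 + (yy - cy) ** 2 > r ** 2   # carve it out

print(f"packed {len(shots)} shots; uncovered voxels left: {remaining.sum()}")
```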

In radiation therapy with continuous dose delivery for Gamma Knife® Perfexion™, the dose is delivered while the radiation unit is in motion, as opposed to the conventional step-and-shoot approach, which requires the unit to stop before any radiation is delivered. Continuous delivery can increase dose homogeneity and decrease treatment time. To design inverse plans, we first find a path inside the tumor volume along which the radiation is delivered, and then find the beam durations and shapes using a mixed-integer programming (MIP) model. The MIP model accounts for various machine constraints as well as clinical guidelines.
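A minimal sketch of the duration-selection step, assuming a precomputed dose-rate matrix giving dose per second from each stop along the path to each voxel; the machine constraint shown is that an active stop must last at least a minimum duration. All numbers are illustrative, and the clinical model includes many more constraints.

```python
# Duration-selection sketch: given a precomputed dose-rate matrix D[v][s]
# (dose per second to voxel v while the focus dwells at path stop s),
# minimize total beam-on time so tumor voxels reach the prescription, a
# healthy voxel stays under its limit, and any active stop lasts at least
# T_MIN seconds (machine constraint). All numbers are illustrative.
import pulp

D = [[1.0, 0.2, 0.1],   # tumor voxel 0
     [0.3, 0.9, 0.4],   # tumor voxel 1
     [0.1, 0.3, 1.1],   # tumor voxel 2
     [0.2, 0.1, 0.2]]   # healthy voxel
RX, OAR_MAX = 10.0, 5.0          # prescription dose and healthy-tissue cap
T_MIN, T_MAX = 0.5, 60.0         # machine min/max dwell time per stop
S = range(3)

prob = pulp.LpProblem("dwell_times", pulp.LpMinimize)
t = [pulp.LpVariable(f"t{s}", 0, T_MAX) for s in S]
z = [pulp.LpVariable(f"z{s}", cat="Binary") for s in S]  # stop s active?

prob += pulp.lpSum(t)                                     # total beam-on time
for v in range(3):
    prob += pulp.lpSum(D[v][s] * t[s] for s in S) >= RX   # tumor coverage
prob += pulp.lpSum(D[3][s] * t[s] for s in S) <= OAR_MAX  # spare healthy voxel
for s in S:                        # a stop is either off or >= T_MIN long
    prob += t[s] <= T_MAX * z[s]
    prob += t[s] >= T_MIN * z[s]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("dwell times:", [round(pulp.value(ts), 2) for ts in t])
```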

Perioperative services are a vital component of hospitals, and any disruption in their operations can have downstream effects on the rest of the hospital. A large body of evidence links inefficiencies in perioperative throughput with adverse clinical outcomes. A regular delay in the operating room (OR) may lead to overcrowding in post-surgical units and, consequently, more overnight patients in the hospital. Conversely, underutilization of the OR not only wastes an expensive, high-demand resource, it also means that other services with unmet demand cannot use it. This mismatch between demand and utilization may, in turn, lead to hold-ups in the OR and propagate further utilization problems downstream. We investigate the utilization of operating rooms by each service. The null hypothesis of this work is that the predicted utilization of the OR, i.e., the current block schedule, completely matches each service's actual utilization. We test this hypothesis for different definitions of utilization, including physical and operational utilization, and reject the null hypothesis. We further analyze why a mismatch may exist and how to optimize the schedule to improve patient flow in the hospital.
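As a sketch of the hypothesis test for a single service, with illustrative numbers rather than hospital data: under the null, weekly actual OR hours match the scheduled block hours, so the paired differences should have mean zero.

```python
# Utilization test sketch for one service: under the null hypothesis the
# weekly actual OR hours match the scheduled block hours, so the paired
# differences have mean zero. Numbers are illustrative, not hospital data.
import numpy as np
from scipy import stats

scheduled = np.array([40, 40, 40, 40, 48, 40, 40, 48])  # block hours/week
actual    = np.array([33, 36, 31, 38, 41, 30, 35, 40])  # hours used/week

t_stat, p = stats.ttest_rel(actual, scheduled)
print(f"mean gap = {np.mean(actual - scheduled):.1f} h/week, p = {p:.4f}")
# A small p rejects the null that the block schedule matches actual use;
# the sign of the gap distinguishes over-booking from under-utilization.
```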

Primary care is an important piece of the healthcare system that heavily affects patients' downstream medical care. Primary care faces specific challenges as healthcare shifts from fee-for-service to population health management and the medical home, focuses on cost savings, and integrates quality measures. We consider the primary care unit at a large academic center that is facing these challenges and, through a redesign of its system, aim to balance staff time with quality of care. In this work we focus on workload imbalance, driven in part by a growing regulatory burden, which directly concerns primary care staff; it can result in missed opportunities to deliver better patient care and to provide a good work environment for physicians and staff. We employ optimization models to reschedule providers' sessions to improve patient flow and, through that, achieve a more balanced workload for the support staff (see the sketch below).

This work was performed with the MIT/MGH Collaboration.
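The rescheduling step described above can be sketched as a small min-max assignment problem: assign provider sessions to weekdays so that the busiest day's support-staff load is as small as possible. Session workloads below are illustrative, and the deployed model handles many more operational constraints.

```python
# Session-rebalancing sketch: assign each provider session to a weekday so
# the busiest day's support-staff workload is minimized (min-max assignment
# MIP). Per-session staff workloads are illustrative.
import pulp

workload = [3, 5, 2, 4, 4, 6, 3, 2, 5, 4]   # staff-hours per session
sessions, days = range(len(workload)), range(5)

prob = pulp.LpProblem("balance_sessions", pulp.LpMinimize)
x = {(s, d): pulp.LpVariable(f"x_{s}_{d}", cat="Binary")
     for s in sessions for d in days}
peak = pulp.LpVariable("peak", lowBound=0)

prob += peak                                        # minimize busiest day
for s in sessions:                                  # each session on one day
    prob += pulp.lpSum(x[s, d] for d in days) == 1
for d in days:                                      # every day under peak
    prob += pulp.lpSum(workload[s] * x[s, d] for s in sessions) <= peak

prob.solve(pulp.PULP_CBC_CMD(msg=False))
loads = [sum(workload[s] * x[s, d].value() for s in sessions) for d in days]
print("daily staff loads:", loads)
```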

In many healthcare services, care is provided continuously; however, care providers (e.g., doctors and nurses) work in discrete shifts. Hence, hand-offs between care providers are inevitable. Hand-offs are generally thought to affect patient care, although the effects are often hard to quantify due to reverse causality between patients' length of stay and the number of hand-off events. We use a natural randomized experiment, induced by physicians' schedules on general medicine teaching teams. We employ statistical tools to show that, between the two randomly assigned groups of patients, the subset that experiences a hand-off has a different length of stay than the other group.

This work was performed with the MIT/MGH Collaboration.
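As a sketch of the comparison (the study's actual methodology may differ), length of stay is typically right-skewed, so a nonparametric two-sample test is a reasonable default; the data below are illustrative.

```python
# Length-of-stay comparison sketch between the two randomized groups. LOS
# is typically right-skewed, so a Mann-Whitney U test is a reasonable
# nonparametric default. Data are illustrative, not study data.
import numpy as np
from scipy import stats

los_handoff    = np.array([4.2, 6.1, 3.8, 7.5, 5.0, 9.3, 4.7, 6.6])  # days
los_no_handoff = np.array([3.9, 4.1, 3.2, 5.8, 4.4, 6.0, 3.7, 4.9])

u_stat, p = stats.mannwhitneyu(los_handoff, los_no_handoff,
                               alternative="two-sided")
print(f"median difference = "
      f"{np.median(los_handoff) - np.median(los_no_handoff):.2f} days, "
      f"p = {p:.3f}")
```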

Many outpatient facilities with expensive resources, such as infusion and imaging centers, experience surges in patient arrivals at some times and under-utilization at others. This pattern results in patient safety concerns, patient and staff dissatisfaction, and limits on growth, among other problems. Scheduling practices are found to be one of the main contributors to this problem.

We developed a real-time scheduling framework to address the problem, specifically for infusion clinics. The algorithm assumes no knowledge of future appointments and does not change past appointments. Operational constraints are taken into account, and the algorithm can offer multiple choices to patients.

We generalize this framework to a new scheduling model and analyze its performance through its competitive ratio, comparing the resource utilization of the real-time algorithm with that of an optimal offline algorithm that knows the entire future. We prove that the competitive ratio of the scheduling algorithm lies between 3/2 and 5/3.

This work was performed with the MIT/MGH Collaboration.
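The algorithm itself is not reproduced here; the sketch below, a plain first-fit rule with illustrative capacities, only shows the online setting it operates in: requests arrive one at a time, earlier bookings are never moved, and each patient is offered a few feasible slots.

```python
# Online scheduling sketch (plain first-fit, NOT the paper's algorithm):
# requests arrive one at a time, existing bookings are never moved, and
# each patient is offered up to K_CHOICES feasible start slots. A request
# occupies `duration` consecutive slots of one chair. Illustrative sizes.
CHAIRS, SLOTS, K_CHOICES = 3, 16, 2
busy = [[False] * SLOTS for _ in range(CHAIRS)]

def offer(duration):
    """Earliest feasible (chair, start) option per chair, up to K_CHOICES."""
    options = []
    for chair in range(CHAIRS):
        for start in range(SLOTS - duration + 1):
            if not any(busy[chair][start:start + duration]):
                options.append((chair, start))
                break
    return options[:K_CHOICES]

def book(chair, start, duration):
    for slot in range(start, start + duration):
        busy[chair][slot] = True

for duration in [4, 2, 6, 3, 4]:     # stream of arriving infusion requests
    choices = offer(duration)
    if choices:                      # here the patient takes the first offer
        chair, start = choices[0]
        book(chair, start, duration)
        print(f"duration {duration} -> chair {chair}, slot {start}")
```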

Tracking COVID-19

We are tracking the spread of COVID-19 in real time on our interactive dashboard, with data available for download. We are also modeling the spread of the virus; preliminary study results are discussed on our blog.