5 Questions to Implement Predictive Analytics
How much data does your company generate associated with your business operations? What do you do with that data? Are you using it to inform your business decisions and, potentially, to avoid future risks?
Predictive Analytics and Safety
Predictive analytics is a field that focuses on analyzing large data sets to identify patterns and predict outcomes to help guide decision-making. Many leading companies currently use predictive analytics to identify marketing and sales opportunities. However, the potential benefits of analyzing data in other areas—particularly safety—are considerable.
For example, safety and incident data can be used to predict when and where incidents are likely to occur. Appropriate data analysis strategies can also identify the key factors that contribute to incident risk, thereby allowing companies to proactively address those factors to avoid future incidents.
Key Questions
Many companies are already gathering large amounts of data. The key is figuring out how to use that data intelligently to guide future business decision-making. Here are five key questions to guide you in integrating predictive analytics into your operations.
1. What are you interested in predicting? What risks are you trying to mitigate?
Defining your desired result not only helps focus the project, it also narrows the brainstorming of risk factors and data sources. This definition can—and should—be referenced continually throughout the project to ensure your work is designed to meet the stated objectives.
2. How do you plan to use the predictions to improve operations? What is your goal in implementing a predictive analytics project?
Thinking beyond the project to overall operational improvements provides bigger-picture insight into the desired business outcomes. This helps when identifying the format results should take to maximize their utility in the field and/or for management. It also helps ensure that the work focuses on variables that can be controlled to improve operations; risks tied to static variables that cannot be changed cannot be mitigated.
3. Do you currently collect any data related to this outcome?
Understanding your current data will help determine whether additional data collection efforts are needed. You should be able to describe the characteristics of any data you have for the outcome you want to predict (e.g., digital vs. paper/scanned, quantitative vs. qualitative, accuracy, precision, collection frequency). The benefits and limitations associated with each of these characteristics will be discussed in a future article.
4. What are some risk factors for the desired outcome? What are some things that may increase or decrease the likelihood that your outcome will happen?
These factors are the variables you will use for prediction in the model. It is valuable to brainstorm with all relevant subject matter experts (e.g., management, operations, engineering, third parties, as appropriate) to get a complete picture. After brainstorming, narrow the risk factors based on the availability/quality of data, whether each risk factor can be managed/controlled, and a subjective evaluation of risk factor strength. The modeling process will ultimately suggest which of the risk factors significantly contribute to the outcome.
5. What data do you have, if any, for the risk factors you identified?
Again, you need to understand and be able to describe your current data to determine whether it is sufficient to meet your desired outcomes. Using data that have already been gathered can expedite the modeling process, but only if those data are appropriate in format and content to the process you want to predict. If they aren’t appropriate, using these data in modeling can result in time delays or misleading model results.
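To make this concrete, here is a minimal sketch of the modeling step in Python (scikit-learn), using synthetic stand-in data. The risk factor names, the simulated outcome, and the model choice are hypothetical illustrations rather than a prescribed method; a real project would use your own incident records and validate the model before acting on it.

```python
# Minimal sketch: fit a logistic regression to historical incident
# records to see which risk factors are associated with incidents.
# All names and data below are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical risk factors brainstormed in question 4.
risk_factors = ["hours_since_break", "task_complexity", "contractor_flag"]
X = np.column_stack([
    rng.uniform(0, 8, 500),   # hours worked since last break
    rng.integers(1, 6, 500),  # task complexity score (1-5)
    rng.integers(0, 2, 500),  # work performed by a contractor (0/1)
])
# Hypothetical outcome from question 1: 1 = incident, 0 = no incident.
y = (0.4 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 2, 500) > 4).astype(int)

# Standardize features so coefficient magnitudes are roughly comparable.
model = LogisticRegression().fit(StandardScaler().fit_transform(X), y)

# Larger absolute coefficients suggest stronger contributors; formal
# significance testing and validation would follow in a real project.
for name, coef in zip(risk_factors, model.coef_[0]):
    print(f"{name}: {coef:+.2f}")
```

The sign and relative size of each coefficient hint at which factors raise or lower incident likelihood, tying the model output back to the risk factors identified in question 4.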
Predictive analytics is versatile and can be applied to help companies analyze a wide variety of problems when the data, desired project outcomes, and business/operational improvements are well-defined. With predictive analytics, companies gain the capacity to:
- Explore and investigate past performance
- Gain the insights needed to turn vast amounts of data into relevant and actionable information
- Create statistically valid models to facilitate data-driven decisions
OSHA Clarifies Position on Incentive Programs
OSHA has issued a memorandum clarifying the agency’s position on workplace incentive programs and drug testing. OSHA’s rule prohibiting employer retaliation against employees for reporting work-related injuries or illnesses does not prohibit workplace safety incentive programs or post-incident drug testing. The Department’s position is that most safety incentive programs and post-incident drug testing policies are implemented to promote workplace safety and health. An action taken under a safety incentive program or post-incident drug testing policy would violate OSHA’s anti-retaliation rule only if the employer took the action to penalize an employee for reporting a work-related injury or illness, rather than for the legitimate purpose of promoting workplace safety and health. For more information, see the memorandum.
Be Our Guest at the Food Safety Consortium
On behalf of our team, Kestrel Management would like to invite you to attend the 6th Annual Food Safety Consortium Conference & Expo on Nov. 13-15 in Schaumburg, IL.
The Consortium is a premier event for food safety education and networking—and we want to offer you the chance to visit us at the event (booth #119) for a discounted rate (see offer below).
You can accomplish more in two or three days at the Food Safety Consortium than you might otherwise achieve in weeks! Here are five ways the Food Safety Consortium will allow you to enhance your business:
- Get expert advice on specific challenges faced by your business.
- Listen to insights from thought leaders & innovators.
- Stay up-to-date with emerging or changing trends.
- Upgrade your skills, knowledge and on-the-job effectiveness.
- Gain new ideas and insights to grow your business.
Come see Kestrel at booth #119. When you register, use our discount code Cubs and receive a 20% discount off registration.
Our team is proud to be part of the Food Safety Consortium, and we hope to see you there!
The Four “A’s” of Food Defense
When considering FSMA, it’s important to look at what industry should be doing under FSMA’s prevention scheme. FDA wants companies to assess risk and implement preventive controls on a broad basis. Thinking about risk-based strategies, whether in the supply chain or in internal systems, and whether you are a grower or an importer, is key for any food company planning for the future.
From Reactive to Proactive
With the FSMA rules, FDA has moved from reactive to proactive. Preventive strategies are the essence of FSMA. Proactively creating or updating a food defense and safety plan is the first step to ensure compliance.
The four “A’s” of food defense, as outlined below, provide a methodology for building a proactive and comprehensive food defense program.
Step 1: Assess
Assess the risks throughout the supply chain, back to the origin of raw materials. Conduct a vulnerability assessment of weaknesses and critical control points to identify where someone could attempt product adulteration. The focus must be both inside and outside company walls, extending to the sources of materials and services within the supply chain for producers and distributors of food to the public.
Step 2: Access
Who has access to critical control points and food material risk areas? Pay close attention to the four key activity types that FDA has identified as particularly vulnerable to adulteration:
- Mixing and grinding activities that involve a high volume of food with a high potential for uniform mixing of a contaminant
- Ingredient handling with open access to the product stream
- Bulk liquid receiving and loading
- Liquid storage and handling, which is typically located in remote, isolated areas
Restrict access to these areas from suppliers, contractors, visitors, and most employees—limiting access to critical employees only. This provides a higher level of protection, and supports video and/or physical monitoring.
Step 3: Alerts
Alerts of intentional and unintentional food adulteration must be sent to the appropriate individuals, according to the documented food safety and defense program. Response time is critical. Every passing minute is a minute when more health risks could develop, leading to a greater chance of negative impacts on public safety and the related businesses.
Step 4: Audit
Auditing operational and regulatory compliance helps to ensure and maintain best food defense practices and provide documentation of compliance to regulators. FSMA promotes the safety of the U.S. food supply by focusing on prevention, rather than reactive response. Prevention is only as effective as the actual compliance processes put in place. Regular and random auditing, including remote video monitoring, provides evidence confirming that the appropriate preventive measures are taken and effective.
Taking a proactive approach to food defense that follows these four “A’s” will help meet a key requirement by ensuring that the organization is working to avoid the risks associated with food adulteration and contamination.
Court Orders EPA to Implement RMP Rule
The U.S. Court of Appeals for the District of Columbia Circuit ruled on Friday, September 21, 2018 that the EPA must implement the Obama-era Risk Management Plan (RMP) Rule. This comes on the heels of the Court’s ruling on August 17, 2018, which stated that EPA does not have authority to delay final rules for the purpose of reconsideration. Usually, the Court would allow 52 days for the EPA to consider appealing the order and to plan how to implement the rule; however, groups supporting the regulation argued that implementation can’t wait.
Read more about the current Court ruling.
USTR Finalizes China 301 List 3 Tariffs
On Monday, September 17, 2018, the Office of the United States Trade Representative (USTR) released a list of approximately $200 billion worth of Chinese imports, including hundreds of chemicals, that will be subject to additional tariffs. The additional tariffs will be effective starting September 24, 2018, and initially will be in the amount of 10 percent. Starting January 1, 2019, the level of the additional tariffs will increase to 25 percent.
In the final list, the Administration also removed nearly 300 items, though it did not provide a specific list of the excluded products. Among the products removed from the proposed list are certain consumer electronics products, such as smart watches and Bluetooth devices; certain chemical inputs for manufactured goods, textiles, and agriculture; and certain health and safety products, such as bicycle helmets, and child safety furniture, such as car seats and playpens.
Individual companies may want to review the list to determine the status of Harmonized Tariff Schedule (HTS) codes of interest.
NACD Responsible Distribution Cybersecurity Webinar
Join the National Association of Chemical Distributors (NACD) and Kestrel Principal Evan Fitzgerald for a free webinar on Responsible Distribution Code XIII. We will be taking a deeper dive into Code XIII.D., which focuses on cybersecurity and information security. Find out ways to protect your company from this constantly evolving threat.
NACD Responsible Distribution Webinar
Code XIII & Cybersecurity Breaches
Thursday, September 20, 2018
12:00-1:00 p.m. (EDT)
Assessing Risk Management Program Maturity
Maturity assessments are designed to tell an organization where it stands in a defined area and, correspondingly, what it needs to do in the future to improve its systems and processes to meet the organization’s needs and expectations. Maturity assessments expose the strengths and weaknesses within an organization (or a program), and provide a roadmap for ongoing improvements.
Holistic Assessments
A thorough program maturity assessment involves building on a standard gap analysis to conduct a holistic evaluation of the existing program, including data review, interviews with key staff, and functional/field observations and validation.
Based on Kestrel’s experience, evaluating program maturity is best done by measuring the program’s structure and design, as well as the consistency of the program’s implementation across the organization. For the most part, a program’s design remains relatively unchanged unless internal modifications are made to the system. Because of this static nature, a “snapshot” provides a reasonable assessment of design maturity. While the design helps to inform operational effectiveness, the implementation/operational maturity model assesses how completely and consistently the program is functioning throughout the organization (i.e., how the program is designed to work vs. how it is working in practice).
Design Maturity
A design maturity model helps to evaluate strategies and policies, practices and procedures, organization and people, information for decision making, and systems and data according to the following levels of maturity:
- Level 1: Initial (crisis management) – Lack of alignment within the organization; undefined policies, goals, and objectives; poorly defined roles; lack of effective training; erratic program or project performance; lack of standardization in tools.
- Level 2: Repeatable (reactive management) – Limited alignment within the organization; lagging policies and plans; seldom known business impacts of actions; inconsistent company operations across functions; culture not focused on process; ineffective risk management; few useful program or project management and controls tools.
- Level 3: Defined (project management) – Moderate alignment across the organization; consistent plans and policies; formal change management system; somewhat defined and documented processes; moderate role clarity; proactive management for individual projects; standardized status reporting; data integrity may still be questionable.
- Level 4: Managed (program management) – Alignment across organization; consistent plans and policies; goals and objectives are known at all levels; process-oriented culture; formal processes with adequate documentation; strategies and forecasts inform processes; well-understood roles; metrics and controls applied to most processes; audits used for process improvements; good data integrity; programs, processes, and performance reviewed regularly.
- Level 5: Optimized (managing excellence) – Alignment from top to bottom of organization; business forecasts and plans guide activity; company culture is evident across the organization; risk management is structured and proactive; process-centered structure; focus on continuous improvement, training, coaching, mentoring; audits for continual improvement; emphasis on “best-in-class” methods.
A gap analysis can help compare the actual program components against best practice standards, as defined by the organization. At this point, assessment questions and criteria should be specifically tuned to assess the degree to which:
- Hazards and risks are identified, sized, and assessed
- Existing controls are adequate and effective
- Plans are in place to address risks not adequately covered by existing controls
- Plans and controls are resourced and implemented
- Controls are documented and operationalized across applicable functions and work units
- Personnel know and understand the controls and expectations and are engaged in their design and improvement
- Controls are being monitored with appropriate metrics and compliance assurance
- Deficiencies are being addressed by corrective/preventive action
- Processes, controls, and performance are being reviewed by management for continual improvement
- Changed conditions are continually recognized and new risks identified and addressed
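As an illustration only, the five levels and the assessment criteria above can be encoded so that gap-analysis scores roll up to a design maturity level. This minimal Python sketch assumes a simple 1-5 scoring scale and an unweighted average; the abbreviated criteria labels are paraphrased from the list above, and a real assessment would weight each criterion and document supporting evidence.

```python
# Minimal sketch: roll gap-analysis scores up to one of the five
# design maturity levels. Scale and scores are illustrative only.
LEVELS = {
    1: "Initial (crisis management)",
    2: "Repeatable (reactive management)",
    3: "Defined (project management)",
    4: "Managed (program management)",
    5: "Optimized (managing excellence)",
}

# Each criterion scored 1 (absent) to 5 (best practice).
scores = {
    "hazards and risks identified and assessed": 3,
    "existing controls adequate and effective": 2,
    "plans cover risks not adequately controlled": 2,
    "plans and controls resourced and implemented": 3,
    "controls documented across functions": 2,
    "personnel understand and are engaged": 3,
    "controls monitored with metrics": 2,
    "deficiencies addressed by corrective action": 3,
    "management review for continual improvement": 2,
    "changed conditions and new risks addressed": 2,
}

average = sum(scores.values()) / len(scores)
level = round(average)
print(f"Average score {average:.1f} -> Level {level}: {LEVELS[level]}")
```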
Implementation/Operational Maturity
The logical next step in the maturity assessment involves shifting focus from the program’s design to a maturity model that measures how well the program is operationalized, as well as the consistency of implementation across the entire organization. This is a measurement of how effectively the design (program static component) has enabled the desired, consistent practice (program dynamic component) within and across the company.
Under this model, the stage of maturity (i.e., initial, implementation in process, fully functional) is assessed in the following areas:
- Adequacy and effectiveness: demonstration of established processes and procedures with clarity of roles and responsibilities for managing key functions, addressing significant risks, and achieving performance requirements across operations
- Consistency: demonstration that established processes and procedures are fully applied and used across all applicable parts of the organization to achieve performance requirements
- Sustainability: demonstration of an established and ongoing method of review of performance indicators, processes, procedures, and practices in-place for the purpose of identifying and implementing measures to achieve continuing improvement of performance
This approach relies heavily on operational validation and seeking objective evidence of implementation maturity by performing functional and field observations and interviews across a representative sample of operations, including contractors.
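As a minimal sketch of how those observations might be tallied, the snippet below counts hypothetical per-unit ratings for a single assessment area against the three stages named above. The unit names, ratings, and the “weakest unit” heuristic are illustrative assumptions, not part of a prescribed method.

```python
# Minimal sketch: summarize field-observation ratings to gauge how
# consistently the program is implemented across work units.
from collections import Counter

STAGES = ["initial", "implementation in process", "fully functional"]

# Hypothetical ratings for one assessment area (e.g., consistency),
# gathered through interviews and functional/field observations.
ratings = {
    "Plant A": "fully functional",
    "Plant B": "implementation in process",
    "Distribution": "initial",
    "Contractors": "implementation in process",
}

counts = Counter(ratings.values())
for stage in STAGES:
    print(f"{stage}: {counts.get(stage, 0)} of {len(ratings)} units")

# Implementation maturity is limited by the least consistent unit.
weakest = min(ratings, key=lambda unit: STAGES.index(ratings[unit]))
print(f"Weakest unit: {weakest} ({ratings[weakest]})")
```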
Cultural Component
Performance within an organization is the combined result of culture, operational systems/controls, and human performance. Culture involves leadership, shared beliefs, expectations, attitudes, and policy about the desired behavior within a specific company. To some degree, culture alone can drive performance. However, without operational systems and controls, the effects of culture are limited and ultimately will not be sustained. Similarly, operational systems/controls (e.g., management processes, systems, and procedures) can improve performance, but these effects also are limited without the reinforcement of a strong culture. A robust culture with employee engagement, an effective management system, and appropriate and consistent human performance are equally critical.
A culture assessment evaluates culture and program implementation status through interviews and surveys conducted up, down, and across a representative sample of the company’s operations. Field, facility, and functional observations of company operations should then be performed to verify and validate the findings.
A culture assessment should evaluate key attributes of successful programs, including:
- Leadership
- Vision & Values
- Goals, Policies & Initiatives
- Organization & Structure
- Employee Engagement, Behaviors & Communications
- Resource Allocation & Performance Management
- Systems, Standards & Processes
- Metrics & Reporting
- Continually Learning Organization
- Audits & Assurance
Assessment and Evaluation
Data from document review, interviews, surveys, and field observations are then aggregated, analyzed, and evaluated. Identifying program gaps and issues enables a comparison of what already exists with what must be improved, developed, or added. This information is often organized into the following categories:
- Policy and strategy refinements
- Process and procedure improvements
- Organizational and resource requirements
- Information for decision making
- Systems and data requirements
- Culture enhancement and development
From this information, it becomes possible to identify recommendations for program improvements. These recommendations should be integrated into a strategic action plan that outlines the long-term program vision, proposed activities, project sequencing, and milestones. The highest priority actions should be identified and planned to establish a foundation for continual improvement, and allow for a more proactive means of managing risks and program performance.
Federal Court Overturns RMP Delay
The EPA’s Risk Management Plan (RMP) Rule (Section 112(r) of the Clean Air Act Amendments) has garnered a lot of attention as its status as a rule has fluctuated since the RMP Amendments were published under the Obama Administration on January 13, 2017.
The latest development in the RMP saga came on August 17, 2018, when the U.S. Court of Appeals for the District of Columbia Circuit ruled that EPA does not have authority to delay final rules for the purpose of reconsideration.
Background
The original RMP Amendments of 2017 were developed in response to Executive Order (EO) 13650, Improving Chemical Facility Safety and Security, and were intended to:
- Prevent catastrophic accidents by improving accident prevention program requirements
- Enhance emergency preparedness to ensure coordination between facilities and local communities
- Improve information access to help the public understand the risks at RMP facilities
- Improve third-party audits at RMP facilities
However, after EPA published the final rule, many industry groups and several states filed challenges and petitions, arguing that the rule was overly burdensome, created potential security risks, and did not properly coordinate with OSHA’s Process Safety Management (PSM) standard.
Under the Trump administration, EPA delayed the effective date of the rule by 20 months—until February 2019—and announced its plan to reconsider the rule’s provisions. On May 30, 2018, the RMP Reconsideration Proposed Rule was published and proposed to:
- Maintain consistency of RMP accident prevention requirements with the OSHA PSM standard
- Address security concerns
- Reduce unnecessary regulations and regulatory costs
- Revise compliance dates to provide necessary time for program changes
Recent Court Actions
In the most recent action, the federal court ruled that the EPA can no longer delay enforcement of the RMP Rule. In the opinion, the judges wrote that the delay “makes a mockery of the statute” because it violates the CAA requirement that rules “have an effective date, as determined by the Administrator, assuring compliance as expeditiously as practicable.” The court further stated that the delay of the rule was “calculated to enable non-compliance.”
EPA Administrator Scott Pruitt countered that the EPA needed more time to weigh concerns, particularly those about security risks associated with chemical facilities disclosing information to the public.
The judges noted that EPA can still substantively revise the RMP Rule and its compliance deadline(s); however, they reinforced that in the CAA, “Congress is seeking meaningful, prompt action by EPA to promote accident prevention.”
What’s Next?
The RMP Rule will not take effect immediately; EPA has time to appeal the decision and petition for rehearing. The earliest that the RMP Amendments (as originally published) could realistically go into effect is October 2018. Based on this, effective dates for requirements contained in the RMP Amendments would be as follows:
- Effective immediately: 3-year compliance audits in each covered process at the facility (original date: March 14, 2017)
- Effective immediately: Duty to coordinate emergency response activities with local emergency responders (original date: March 14, 2018)
- March 14, 2020: Emergency Response Program revisions
- March 15, 2021: Third-party auditor requirements; incident investigation and root cause analyses; safer technology and alternatives analyses/IST provisions; emergency response exercise; public availability of information
- March 14, 2022: Revised elements of RMP provisions in Subpart G
If a rehearing is granted, the timeline would likely extend further into the future. Meanwhile, comments are due on the RMP Reconsideration Proposed Rule on Thursday, August 23, 2018. Kestrel will continue to monitor developments with the RMP Rule, as its final status remains a moving target.
Audit Program Best Practices: Part 2
Audits provide an essential tool for improving and verifying compliance performance. As discussed in Part 1, there are a number of audit program elements and best practices that can help ensure a comprehensive audit program. Here are 12 more tips to put to use:
- Action item closure. Address repeat findings. Identify patterns and pursue root cause analysis and sustainable corrections.
- Training. Training should be provided throughout the entire organization, across all levels:
  - Auditors are trained on both technical matters and program procedures.
  - Management is trained on the overall program design, purpose, business impacts of findings, responsibilities, corrections, and improvements.
  - Line operations are trained on compliance procedures and company policy/systems.
- Communications. Communications with management should be done routinely to discuss status, needs, performance, program improvements, and business impacts. Communications should be done in business language—with business impacts defined in terms of risks, costs, savings, avoided costs/capital expenditures, benefits. Those accountable for performance need to be provided information as close to “real time” as possible, and the Board of Directors should be informed routinely.
- Leadership philosophy. Senior management should exhibit top-down expectations for program excellence. EHSMS quality excellence goes hand-in-hand with operational and service quality excellence. Learning and continual improvement should be emphasized.
- Roles & responsibilities. Clear roles, responsibilities, and accountabilities need to be established. This includes top management understanding and embracing their roles/responsibilities. Owners of findings/fixes also must be clearly identified.
- Funding for corrective actions. Funding should be allocated to projects based on significance of risk exposure (i.e., systemic/preventive actions receive high priority). The process should incentivize proactive planning and expeditious resolution of significant problem areas and penalize recurrence or back-sliding on performance and lack of timely fixes.
- Performance measurement system. Audit goals and objectives should be nested with the company business goals, key performance objectives, and values. A balanced scorecard can display leading and lagging indicators. Metrics should be quantitative, indicative (not all-inclusive), and tied to their ability to influence. Performance measurements should be communicated and widely understood. Information from auditing (e.g., findings, patterns, trends, comparisons) and the status of corrective actions often are reported on compliance dashboards for management review.
- Degree of business integration. There should be a strong link between the programs, procedures, and methods used in a quality management program—EHS activities should operate in patterns similar to core operations rather than as ancillary add-on duties. In addition, EHS should be involved in business planning and management of change (MOC). An EHSMS should be well-developed and designed for full business integration, and the audit program should feed critical information into the EHSMS.
- Accountability. Accountability and compensation must be clearly linked at a meaningful level. Use various award/recognition programs to offer incentives to line operations personnel for excellent EHS performance. Make disincentives and disciplinary consequences clear to discourage non-compliant activities.
- Deployment plan & schedule. Best practice combines the use of pilot facility audits, baseline audits (to design programs), tiered audits, and a continuous improvement model. Facility profiles are developed for all top priority facilities, including operational and EHS characteristics and regulatory and other requirements.
- Relation of audit program to EHSMS design & improvement objectives. The audit program should be fully interrelated with the EHSMS and feed critical information on systemic needs into the EHSMS design and review process. It addresses the “Evaluation of Compliance” element under EHSMS international standards (e.g., ISO 14001 and OHSAS 18001). Audit baseline helps identify common causes, systemic issues, and needed programs. The EHSMS addresses root causes and defines/improves preventive systems and helps integrate EHS with core operations. Audits further evaluate and confirm performance of EHSMS and guide continuous improvement.
- Relation to best practices. Inventory best practices and share/transfer them as part of audit program results. Use best-in-class facilities as models and “problem sites” for improvement planning and training. Consider an audit program that goes beyond the traditional “find it, fix it, find it, fix it” repetitive cycle to one that yields real understanding of root causes and patterns. In this model, if the issues can be categorized and are wide in scale, the design of solutions can lead to company-wide corrective and preventive measures. This same method can be used to capture and transfer best practices across the organization. These improvements are sustained through the continual review and improvement cycle of an EHSMS and are verified by future audits.
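To illustrate the pattern-analysis step described above, here is a minimal Python sketch that tallies audit findings by category and flags categories recurring across multiple sites as candidates for systemic root cause analysis. The finding records and category labels are invented for illustration.

```python
# Minimal sketch: categorize audit findings to surface repeat,
# systemic issues that warrant company-wide corrective action
# rather than site-by-site fixes. Records are hypothetical.
from collections import Counter

findings = [
    ("Site 1", "training records incomplete"),
    ("Site 2", "training records incomplete"),
    ("Site 3", "training records incomplete"),
    ("Site 1", "containment inspection missed"),
    ("Site 4", "labeling error"),
]

by_category = Counter(category for _, category in findings)
sites_affected = {
    category: {site for site, cat in findings if cat == category}
    for category in by_category
}

# Categories recurring across multiple sites suggest a systemic
# root cause and a candidate for a company-wide preventive fix.
for category, count in by_category.most_common():
    sites = sites_affected[category]
    if len(sites) > 1:
        print(f"Systemic candidate: {category} "
              f"({count} findings across {len(sites)} sites)")
```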
Read Part 1 of the audit program best practices.