This task applies forecasting methods to The National Red Cross Organization, comparing and contrasting the following methods:
Seasonal, Delphi, Technological, Time Series
The goal is to explain how The National Red Cross Organization of the United States uses each of these methods to forecast demand under conditions of uncertainty. I would like your thoughts on comparing and contrasting the four above-mentioned methods.
Clarification: I want to know how each of these techniques is used by the Red Cross, and then a comparison.
What is the Delphi method? A forecast is developed by a panel of experts who anonymously answer a series of questions; the responses are fed back to panel members, who may then change their original answers. Kindly think in the following terms: this method is:
- very time consuming and expensive
- new groupware makes this process much more feasible
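The anonymous answer-feedback-revise cycle described above can be sketched in a few lines of Python. This is a minimal illustration only: the panel, the question, and the convergence behavior are assumptions for the example, not Red Cross data.

```python
import statistics

def delphi_rounds(experts, question, rounds=3):
    """Illustrative Delphi loop: each expert answers anonymously,
    sees the group summary, and may revise in the next round."""
    estimates = [expert(question, None) for expert in experts]
    for _ in range(rounds - 1):
        summary = statistics.median(estimates)          # anonymous feedback
        estimates = [expert(question, summary) for expert in experts]
    return statistics.median(estimates)

def make_expert(initial):
    """Toy expert that nudges its estimate halfway toward the group median."""
    state = {"est": initial}
    def expert(question, feedback):
        if feedback is not None:
            state["est"] += 0.5 * (feedback - state["est"])
        return state["est"]
    return expert

panel = [make_expert(v) for v in (100, 140, 220)]
print(delphi_rounds(panel, "units of blood needed next month"))  # converges toward 140.0
```

The closure-based experts stand in for real panelists; in practice, groupware would collect and anonymize each round's responses, which is why the method is time-consuming without such tools.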
The American Red Cross has used this committee: it consisted of over 60 members, approximately 15 to 20 of whom attended any monthly meeting. Please think of the following: the committee used a modified Delphi technique to identify critical components of the materials that support the national asthma guidelines. They began with a review of the NHLBI guidelines, listing each of the key elements or teaching points. Over a period of 6 months and through more than 20 revisions, the committee identified 135 key elements that were then incorporated into a 28-item evaluation tool. Each item was to be rated on a scale of 1 (topic not covered) to 5 (exceeds criteria). An item had to receive a rating of > 2.5 to meet the CAC standard of "acceptable."
Please consider the following: Armed with this new tool, 14 committee members, well-versed in the intent of the tool, rated 133 different asthma education materials, each of which was reviewed by at least two members. As the completed evaluations were tallied, materials with widely disparate ratings underwent a third review. Amazingly, the scores of the reviewer pairs were within 20% of each other.
You need to understand this: Of the 133 materials reviewed, only the NHLBI guidelines themselves met all of the criteria as determined by the committee. Based on these findings, the committee was concerned that they might have set too high a standard, and raised the question of whether all criteria should be given equal weight. Please reflect on the following: They concurred that it was possible to identify a subset of 10 items that they believed were critical in meeting minimal acceptable standards. When they applied this new subset of criteria, 21 of the patient education materials met minimum acceptable standards.
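The rating threshold and the disparity check described above can be sketched as follows. The item names and scores are hypothetical, and reading "within 20% of each other" as 20% of the 5-point scale is an interpretive assumption for the example.

```python
ACCEPTABLE = 2.5   # an item must rate > 2.5 to meet the CAC "acceptable" standard
SCALE_MAX = 5.0    # items rated 1 (topic not covered) to 5 (exceeds criteria)

def needs_third_review(score_a, score_b, tolerance=0.20):
    """Flag a material whose two reviewers disagree by more than 20%
    of the rating scale (one reading of "within 20% of each other")."""
    return abs(score_a - score_b) > tolerance * SCALE_MAX

def meets_standard(item_scores, required_items):
    """A material passes if the mean rating of every required item
    exceeds the ACCEPTABLE threshold."""
    return all(
        sum(item_scores[item]) / len(item_scores[item]) > ACCEPTABLE
        for item in required_items
    )

# Hypothetical ratings from a reviewer pair on three items of the tool
scores = {"triggers": [4, 3], "inhaler use": [3, 3], "action plan": [5, 4]}
print(needs_third_review(4.0, 2.5))   # True: gap of 1.5 exceeds 20% of the scale
print(meets_standard(scores, scores))  # True: every item's mean exceeds 2.5
```

Restricting `required_items` to the 10-item critical subset, rather than all 28 items, mirrors how the committee relaxed its standard so that 21 materials passed.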
Kindly reflect on this: The committee members were surprised to find that 6 years after the release of the national asthma education guidelines, relatively few of the materials available from advocacy groups, clinicians, or industry effectively taught the key elements of the guidelines.
They believed that they should not only share their findings with the consortium members, but also offer feedback to other interested organizations. To that end, the committee now offers members and interested parties technical assistance in reviewing their asthma education materials. The committee identifies both strengths and key topic-area deficiencies in existing materials in an effort to promote the development of comprehensive materials and teaching programs for patients, families, and providers.
Kindly think in the following terms: Since the development of the evaluation tool, several ...
The American Red Cross is studied.
Discussion questions to consider
The following questions are from the Handbook of Training Evaluation and Measurement Methods by Jack J. Phillips. I could use some additional feedback and insight on these questions.
SOME ORGANIZATIONS NEED AN ROI PROCESS THAT WILL LOOK FORWARD (FORECAST) AND BACKWARD (ROI RESULTS). CAN THIS MODEL BE USED IN BOTH SCENARIOS? EXPLAIN.
CAN THIS ROI PROCESS MODEL BE USED WITH ALL TYPES OF PERFORMANCE INTERVENTIONS? PROVIDE EXAMPLES OF DIVERSE APPLICATIONS, ADDRESSING THE FIRST THREE BLOCKS IN THE MODEL.
ONE CRITICISM OF AN ROI MEASUREMENT IS THAT IT DOES NOT PRESENT A "BALANCED" VIEW OF RESULTS. HOW DOES THIS MODEL ADDRESS THIS ISSUE?
WHY IS IT IMPORTANT TO OBTAIN PARTICIPANT FEEDBACK IN TRAINING PROGRAMS?
ONE HRD MANAGER COMMENTED ".....SMILE SHEETS (PARTICIPANT FEEDBACK FORMS) ARE NOT WORTH THE TIME IT TAKES TO COMPLETE THEM. THESE HAPPINESS RATINGS DO NOTHING BUT FEED THE EGO OF THE INSTRUCTOR." IS THIS TRUE? EXPLAIN.
DESIGN A PARTICIPANT FEEDBACK FORM FOR A ONE-DAY WORKSHOP ON IMPROVING INTERVIEWING SKILLS.
WHAT ARE THE ADVANTAGES AND DISADVANTAGES OF ATTITUDE SURVEYS IN EVALUATION?
WHY ARE PERFORMANCE TESTS IMPORTANT TO HRD EVALUATION?
DESIGN AND DESCRIBE A PERFORMANCE TEST TO MEASURE THE EFFECTIVENESS OF A TRAINING PROGRAM DESIGNED TO IMPROVE INTERVIEWING SKILLS.
USE EXAMPLES TO ILLUSTRATE THE DIFFERENCES AMONG EVALUATION INSTRUMENT DESIGN, EVALUATION DESIGN, AND DATA COLLECTION METHODS.
WHY ARE FOLLOW-UP EVALUATIONS IMPORTANT?
DESIGN A FOLLOW-UP QUESTIONNAIRE FOR A THREE-DAY PERFORMANCE MANAGEMENT WORKSHOP FOR MIDDLE MANAGERS. MAKE ANY ASSUMPTIONS NECESSARY TO COMPLETE THE ASSIGNMENT.
DO ORGANIZATIONS USE PERFORMANCE CONTRACTS ON A ROUTINE BASIS? EXPLAIN.
WHAT ADVANTAGES DO ACTION PLANS HAVE OVER QUESTIONNAIRES?
WHAT ARE THE WEAKNESSES OF THE INTERVIEW PROCESS?
WHEN IS IT APPROPRIATE TO USE OBSERVATIONS IN THE EVALUATION PROCESS?
WHAT ARE THE DIFFERENCES IN THE INTERVIEW AND FOCUS GROUP PROCESS?
WHAT ARE THE PROBLEMS WITH THE FOCUS GROUP PROCESS?
WHAT DIFFICULTIES ARE ENCOUNTERED WHEN USING EVALUATION INFORMATION OBTAINED FROM OBSERVERS?
WHICH IS THE MOST EFFECTIVE DATA-COLLECTION METHOD? WHY?
IS IT POSSIBLE TO USE ALL DATA-COLLECTION METHODS IN THE EVALUATION OF A SINGLE PROGRAM? EXPLAIN.