Case Study: Deutsche Allgemeinversicherung (DAV)

Executive Summary: This case study involved the analysis of the Statistical Process Control (SPC) of documents for DAV, a company looking to stay ahead of the rest in today's competitive market. Specifically, the company is looking to improve its process of documenting customer information in forms filled out both by customers and by representatives. What is important to note here is that documents can have errors (as they often do); however, these errors are not necessarily all of the same importance to the company.
The challenge here was to implement a relevant method which would actually yield improvements in the long run, which is why the company chose to establish p-charts. In essence, these p-charts record whether a document is correct or has errors and display the results on a graph. Through the involvement of all departments of the company, these errors can be analyzed, and information can be acquired on the specific sectors of the company in need of help. Throughout the case study we made recommendations addressing the potential challenges which a p-charting process brings to the table.
We also made specific suggestions addressing the gray area of what constitutes a mistake in a document.

1. DAV is using SPC because it is a mathematical method which is easily analyzable as a hard-science method. This method is usually easily implemented in the manufacturing industry, since one is usually looking at parts which have precise measurement requirements. In our case, however, we run into the issue that DAV is looking at forms which are filled out by the customers themselves. For starters, these forms have varying errors, from wrong addresses, as an example, to wrong telephone numbers.
Also, each form could have multiple errors, which poses the question of what exactly constitutes an error. All of these "variables" need to be defined clearly beforehand so that error counts are accurate and mistakes such as double-counting are avoided.

2. Basically, a p-chart is used to analyze any process with the goal of finding how many errors are in the process. For example: let's say that we work for a bakery which sells cookies in batches of 10. Each batch of 10 cookies is made out of exactly 2 lb. of dough, with the catch that our customers expect exactly 10 chocolate chips per cookie.
Since we have lots of orders being placed, we need to speed up the process of assigning 10 chocolate chips to each cookie. Therefore, we simply toss 100 chocolate chips into every batch and mix it up really well before portioning each batch into 10 individual cookies. As you might imagine, no matter how well you mix up each batch, you are going to get cookies which have more than 10 chocolate chips and cookies which have fewer. Since our customers expect exactly 10 chocolate chips per cookie, we can say that all cookies with fewer than 10 chips are defective.
Therefore, when looking at a batch of 10 cookies, we can observe a ratio of conforming cookies to defective cookies, which is essentially what a p-chart does. The chart below shows what one batch of cookies might look like after it is cut up and baked:

[Chart: number of chocolate chips per cookie for one batch of 10 cookies]

From this chart we can see that out of the 10 cookies baked, 2 are defective. So 80% of the cookies are conforming, which would be shown in a p-chart.

3. The size of each sample taken depends on its respective accuracy rate.
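The tally above can be sketched in a few lines of Python. The individual chip counts are illustrative, chosen only to match the example's two defective cookies:

```python
# Count defective cookies (fewer than 10 chips) in one batch of 10.
# The chip counts are illustrative, matching the example's 2 defectives.
chips_per_cookie = [10, 11, 9, 10, 12, 10, 8, 10, 11, 10]

defective = sum(1 for chips in chips_per_cookie if chips < 10)
p = defective / len(chips_per_cookie)  # fraction defective: one p-chart point

print(defective)   # 2
print(1 - p)       # 0.8 -> 80% conforming
```

Each sampled batch would contribute one such fraction-defective point to the p-chart.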
If a document tends to have a lot of errors, then fewer samples will need to be taken in order to get accurate data on the number of errors per sample. However, if a document is done correctly 99% of the time, one is going to need lots of samples in order for those errors to appear. The problem with this is that it penalizes processes which are done correctly most of the time, forcing the people in charge of them to take lots of samples when really they should be the ones rewarded for doing their jobs correctly.
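A standard rule of thumb makes this trade-off concrete: for a p-chart's lower 3-sigma limit to stay above zero, the sample size n must exceed 9(1 − p̄)/p̄. The sketch below is just this positivity condition (an assumption of 3-sigma limits, not a formula quoted from the case):

```python
# Smallest sample size n keeping a p-chart's lower 3-sigma limit above
# zero: p - 3*sqrt(p*(1-p)/n) > 0 rearranges to n > 9*(1-p)/p.
def min_sample_size(p_bar):
    """Return the n above which the lower 3-sigma limit is positive."""
    return 9 * (1 - p_bar) / p_bar

# An error-prone document (20% defective) needs only a small sample,
# while a 99%-correct document needs hundreds of samples per point.
print(round(min_sample_size(0.20)))  # 36
print(round(min_sample_size(0.01)))  # 891
```

This is exactly the imbalance described above: the cleaner the process, the more sampling work its owners must do.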
Equation 8-5 from Fundamentals of Quality Control and Improvement can be used.

4. From the 12-week diagnostic period, the Policy Extension Group has an average of 15.667 errors per 300 samples. The 3-sigma control limits can be seen below in Figure 1 and show the variation in the number of errors seen while filling out documents. In weeks 23 and 24 the process was above the 3-sigma control limit and therefore out of control. These two weeks show that something out of the ordinary occurred and led to a large increase in the number of errors seen per 300 samples.

5. Better teams do more sampling: In order to address this issue, we suggest implementing a random assignment policy, where regardless of what document type people are in charge of, they will be assigned a random type to sample; also, these samples should be anonymous until completed. This avoids putting the heaviest workloads on the people in charge of the processes which are less prone to errors, as well as avoiding people falsifying their process in order to try to show they have fewer errors.

When is a mistake not a mistake? This issue can be dealt with simply by having the management team meet all together and come up with sound rules to be applied universally. If certain data is more important to a specific document than another, then they could also come up with a weighting system; our suggestion, however, is to simply strive for the least number of errors altogether, since the processes should be done correctly in the first place.

Measuring lawyers: Once again, this is up to the managers in charge of the lawyers.
Our suggestion is to have the team in charge of the lawyers come up with a relevant system to measure their work. If whatever they do is hard to measure numerically, in terms of what is done "right" and what is done "wrong," then maybe a system which measures how long it took them to accomplish their task would be more appropriate. One could install a system that weighs the complexity of each customer's legal issue and puts it up against the amount of time taken to resolve it.
This would scale all of the problems evenly, so that a complex problem is assumed to need more time than an easy one. Over time this process could be improved, as managers would obtain a better sense of what kind of timing is appropriate.

Automatic charting: This issue seems readily resolvable: the IT department could write a program to make the charting of the processes automatic, saving the time of those who are doing it manually. Also, the company could train and hire people to dedicate themselves to the charting if making a program is not an option.
This would be costly at first, but would pay off over time if the charting is as tedious as described.

On the prowl: We do not think one should complicate a process in order to make it less easily understandable; on the contrary, the charts coming out of this reform should be intuitive. In order to avoid managers harshly judging modestly performing processes, we suggest that the Human Resources department hold meetings to explain to managers why certain processes might look to be performing less well than others.
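The automatic-charting idea could start as a short script. The sketch below is a hypothetical minimal example: it estimates p-bar from a 12-week baseline (counts invented to average 15.667 errors per 300 samples, as in the diagnostic period), derives 3-sigma limits, and flags later weeks that fall outside them; the counts for weeks 23 onward are likewise invented for illustration.

```python
# Hypothetical sketch of automatic p-chart flagging: estimate p-bar from
# a baseline period, derive 3-sigma limits, and list out-of-control weeks.
from math import sqrt

def out_of_control_weeks(baseline_counts, new_counts, n, first_week):
    """Weeks whose defect fraction falls outside the 3-sigma limits."""
    p_bar = sum(baseline_counts) / (len(baseline_counts) * n)
    sigma = sqrt(p_bar * (1 - p_bar) / n)
    ucl, lcl = p_bar + 3 * sigma, max(0.0, p_bar - 3 * sigma)
    return [first_week + i for i, c in enumerate(new_counts)
            if not lcl <= c / n <= ucl]

# 12 baseline weeks averaging 15.667 errors per 300 samples (188 total).
baseline = [15, 16, 14, 17, 15, 16, 15, 14, 16, 17, 15, 18]
later = [28, 30, 18, 20]  # invented counts for weeks 23-26

print(out_of_control_weeks(baseline, later, 300, 23))  # [23, 24]
```

With these assumed numbers the upper limit works out to roughly 27 errors per 300 samples, so the two high weeks are flagged automatically instead of being spotted by eye.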
Since we are dealing with forms which are completed manually, it is natural for some to have more errors than others; it is in the nature of the variability within each type of form. Therefore, by educating managers on these differences, they might understand, for example, that department A's numbers, dealing with really complicated forms with a lot of information, are good at around 80%, while department B's numbers, dealing with really easily filled forms without a lot of information, are good at around 95%.

6. To begin improving the process, one could move in many different directions; however, we thought it would be most important to begin by setting a goal. For starters, we suggest coming up with a reasonable yet challenging percentage of non-conforming documents to be acceptable as a target. Once a target is set, it would be important for each branch of the company to identify which documents are most critical.
Once these two steps are done, management could proceed to coming up with a grading scale for errors, prioritizing the data which are most important in each document. Then each department could begin sampling and producing results for their p-charts, observing the areas which need the most immediate improvement. Finally, once areas are identified, the company needs to focus on coming up with ways to fix these errors and bring each document up to standard, and not to hold back once they reach their first target: improvement should always be continuous and never-ending.