Opinion & Essays - November 2001, Issue #87
WHAT PARENTS ARE SAYING
By Lon Woodbury
Last year, Woodbury Reports Inc. started circulating surveys to
parents who had placed at-risk children in residential schools and programs. The goal was to get some insight into the attitudes of
parents and students who had been involved with residential placement. We were also looking for some indication of what the next step
might be for further research. As of mid-October, we had collected 268 completed surveys. Dr. Carol Maxym and I presented our initial
findings at the Fall Independent Educational Consultants Association (IECA) Conference in Tampa, Florida. This article is a summary
of that presentation. A copy of the actual survey is available online by clicking
here. We felt that an important aspect of the research was that virtually all the questions on the survey were open-ended. Since
no answer choices were offered for respondents to select from, the words and phrases were the respondents' own;
their answers could not be influenced by any descriptions provided on the form.
It is very important to remember that our findings are tentative: the responses were not scientifically selected, and open-ended questions
are difficult to evaluate and rate. However, some interesting patterns emerged from the early data.
Of the respondents, 198 were mothers, 45 were fathers, and 18 had been students. The responses represented 127 students who had graduated
from their school/program, 66 who had left early, and 72 who were still attending.
Responses covered 115 different schools/programs with a wide range of approaches:
drug treatment = 9, short-term wilderness = 47, long-term wilderness = 18, behavioral modification = 35, and RTC/psychiatric = 37.
At the end of the survey, each respondent was asked to give an overall rating, ranging from 0, indicating they thought the
program was harmful, to 5, indicating they found the program very effective and very appropriate. Overall, the respondents
were a satisfied group, with 145 rating their program a 5. Twenty-three respondents gave their program a 0, and the average rating
from all respondents was 3.82. Yet these ratings also show that about one-third of the respondents were negative or unenthusiastic in
their approval, a fairly large proportion of the total survey group. This could give cause for concern, considering the
effort and money those parents put into a program of which they only modestly approved; it suggests they likely do not feel
they got their money's worth.
We looked for patterns suggested by the responses to each question and compared them with how satisfied the respondents
were overall. This was done by averaging each group's ratings; however, we did not have time to make many of the comparisons that might
have been interesting. Possible patterns suggested by the figures are described below. Our calculations were supplemented
by several in-depth interviews Dr. Carol Maxym conducted with various respondents.
When asked what behaviors led to a placement decision (multiple answers were common), the most frequent were: drugs = 158, school
failure = 122, and rebellion = 105. All other terms were used less than half as frequently as these three, and included:
law breaking = 47, anger = 53, negative friends = 25, and running away = 37. The highest program satisfaction averages came from respondents
who identified the major reasons for placement as suicide = 4.6, sexual permissiveness = 4.73, and running away = 4.24; all are behaviors
that can be prevented by a high degree of structure and close staff supervision. The lowest satisfaction ratings were expressed
by those whose reasons for placement were bipolar disorder = 3.5, rebellion = 3.67, and school failure = 3.71. In other words, the
low program ratings came from respondents whose reasons for seeking placement related either to attitudes, which are
more abstract and harder for a school to impact, or to pathology, perhaps suggesting that inappropriate placements are occurring too
frequently.
According to the survey, parents learned of the program they selected primarily from three types of resources: educational
consultants = 76, Woodbury Reports publications = 87, and word of mouth = 55. The number of respondents mentioning Woodbury Reports
is probably artificially high, simply reflecting that we were more likely to receive responses from people who had used our publications
or consultation services. Looking at satisfaction averages, the highest ratings came from the 23 respondents who learned of programs
from local child care professionals. These respondents gave their programs an average rating of 4.26, while the lowest average, 3.00,
was given by the 16 respondents who had learned about their program through another program.
When asked about the strengths of their program, the answers overwhelmingly indicated that the staff was the most positive aspect.
Of the respondents, 160 answered "staff," often naming specific individuals. Many used other terms describing staff functions, such as:
communication = 24, parent support = 29, structure = 57, and therapy = 47. The same pattern showed up in respondent ratings of various
aspects of the program, with functions carried out by staff members rated as very positive program elements, for example:
aftercare = 4.75, communication = 4.46, consistency = 4.56, emotional growth = 5.0, and program planning = 4.54.
A similar pattern showed up when respondents were asked about the weak points of their program. That is, the most commonly
identified weaknesses also related to staff: 56 respondents indicated staff, 62 indicated communication, and 23 indicated
turnover. In satisfaction averages, the lowest rating came from those who named therapy as a weakness, at 1.67.
"Misled" received a rating of 1.89, suggesting that promising more than can be delivered
can have drastic consequences, at least in the opinion of some respondents. It also seems some respondents wanted
more therapy.
Overall, there was a suggestion that the opinion of credible sources played an important role in placement decisions. This
is indicated by the high rate of satisfaction among those who found programs through local child care professionals such as psychiatrists
and therapists, as well as the relatively high satisfaction among those using independent educational consultants.
Another observation Dr. Carol Maxym made from her discussions was that parents tended to use diagnostic terms in verbal
conversations, but in their written responses they used a somewhat different vocabulary. In fact, the words most commonly used
in the responses might turn out to be the most helpful result of the survey. They suggest that parents
are thinking in terms of a different vocabulary, perhaps one of common sense, rather than in terms of a diagnostic mindset. Parents
may be speaking a somewhat different language than professionals, which could lead to a lot of subtle but
important miscommunication. If a professional uses the same terms parents are using to conceptualize the problem in their own minds,
it could be extremely helpful in building a team that is on the same course.
These results reflect a very preliminary step in our project, and we are still collecting surveys. We encourage any parents or former
students who have been involved in residential schools and programs for at-risk kids to fill out the survey.
The more people who complete it, the more likely it is that the patterns suggested by the data are truly valid, and the more
useful information this project will provide. We would very much appreciate the help.