Investigating the Dip VET Unknown Unknowns: TAEPDD501
If you missed the background, we have decided to take a look at what experienced training professionals do, and compare that with what the TAE50116 Dip VET says they should be doing. You can read more about that HERE. Following on from our first articles about four core units (TAEASS501, TAEDEL502, TAEASS502, TAELLN501), we have now collected some descriptive data related to another core unit from within the TAE Diplomas, TAEPDD501 – Maintain and enhance professional practice.
The following table shows the summary of assessment outcomes in this unit.
For the sake of comparison, we will include in this table (and subsequent ones) the data for the earlier units for which we have collected data. For Performance Evidence, the figures represent the average percentage of benchmarks that were satisfactorily demonstrated, and the range of that percentage across all 50 sample candidates. For Knowledge Evidence, the figures represent the percentage of candidates who satisfactorily demonstrated all the benchmarks.
For the unit TAEPDD501, among the 50 RPL candidates whose submissions were sampled, the average percentage of Performance Evidence benchmarks demonstrated satisfactorily was 70%. This is close to the average of 68.4% that has emerged from the five units examined thus far. Compared to the other Core units within the Dip VET, it is stronger than TAEASS501 and TAELLN501, but weaker than TAEASS502 and TAEDEL502.
Looking at the range, among these 50 candidates it was 50-100%, which is closest to the data for TAEASS502. The issues surrounding trainer capacity in this particular part of their work are further foregrounded in the Knowledge Evidence area, where fewer than two-thirds of experienced trainers/assessors satisfactorily demonstrated the required Knowledge.
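To make the summary figures concrete, here is a minimal sketch of how the average and range reported above could be computed from per-candidate benchmark results. The data below are invented for illustration only; they are not the study's actual sample.

```python
# Illustrative sketch (hypothetical data): computing the summary figures
# for one unit from per-candidate Performance Evidence outcomes.
# Each inner list holds one candidate's benchmark results
# (True = satisfactorily demonstrated).

def summarise(candidates):
    """Return (average %, min %, max %) of benchmarks demonstrated."""
    pcts = [100 * sum(c) / len(c) for c in candidates]
    avg = sum(pcts) / len(pcts)
    return round(avg, 1), min(pcts), max(pcts)

# Three hypothetical candidates assessed against 10 benchmarks each.
sample = [
    [True] * 7 + [False] * 3,   # 70% demonstrated
    [True] * 5 + [False] * 5,   # 50% demonstrated
    [True] * 10,                # 100% demonstrated
]

avg, lo, hi = summarise(sample)
print(f"average: {avg}%  range: {lo:.0f}-{hi:.0f}%")
```

The same calculation, applied across all 50 sampled candidates, would yield the 70% average and 50-100% range described above.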
Performance in the individual benchmarks
The following table lists the benchmarks used for assessing Performance Evidence of RPL candidates; the right-hand column indicates the percentage of candidates who demonstrated it satisfactorily.
We will allow those data to speak for themselves, and move to the Discussion.
While these data are very interesting, and while it may be tempting to draw inferences and conclusions from them, we must exercise caution until further analyses have been conducted. Having said that, we do notice the following three things:
- in simple terms, there appears to be a lot of discussion among peers regarding both delivery and assessment practice. This is perhaps expected among a group of people who have chosen to seek RPL for this unit, since this requirement is fairly well known and understood.
- documentation of reflective activity surrounding assessment is much more robust than that surrounding delivery. Anecdotally, this is possibly due to the higher regulatory emphasis on recording ‘all things assessment’. By contrast, critical incidents related to assessment were identified less frequently than critical incidents related to delivery. Perhaps confusing things further, opportunities to improve were more evident with regard to delivery than to assessment practice.
- around half of the trainers/assessors did not appear to base their PD on research, consulting and networking activities, and about a third did not seem to make a connection between PD activities and broader organisational and industry needs.
What next for this investigation into Dip VET RPL outcomes?
We will continue to analyse data related to these and the other two Core units from within the TAE50116 Diploma of Vocational Education & Training. Ultimately, we are seeking to identify:
- those components that are commonly not demonstrated through an assessment based on RPL.
- whether there are any statistically significant differences in these ‘gaps’ between RPL candidates with more or less than 5 years' experience, and between those who perform front-line (‘coal-face’) and back-office roles.
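One common way to test for the kind of difference described above is a two-proportion z-test comparing the rate at which a given ‘gap’ appears in each group. The sketch below is illustrative only: the counts are invented, and the study itself does not specify which test will be used.

```python
# Illustrative sketch (hypothetical counts): a two-proportion z-test of the
# kind that could compare a 'gap' rate between candidates with more vs.
# less than 5 years' experience. All counts below are invented.
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """z statistic and two-sided p-value for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                    # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))   # pooled standard error
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# e.g. 18 of 25 less-experienced vs 10 of 25 more-experienced candidates
# failing to demonstrate a given benchmark (hypothetical figures).
z, p = two_proportion_z(18, 25, 10, 25)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With 50 candidates split into two groups, such a test would have limited power, so only fairly large differences between the groups would register as statistically significant.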
We expect this investigation to be completed by about August 2018. Results will be published at Fortress Learning and in relevant journal/s.