Practice Amidst COVID-19 Pandemic
Abstract
The Philippine government postponed the face-to-face learning modality due to the COVID-19 pandemic, a modality that was crucial for nursing students' application of knowledge, skills, and attitudes in clinical settings. This study determined the level of readiness for clinical practice of the nursing students of the University of Perpetual Help System DALTA-Calamba Campus amidst the COVID-19 pandemic. Further, the study examined whether the level of readiness differed significantly when the students were grouped according to profile. The study used a descriptive quantitative research design. The total population comprised 23 third- and fourth-year nursing students of the University of Perpetual Help System DALTA-Calamba Campus in Academic Year 2022-2023, selected through purposive sampling. The Casey-Fink Readiness for Practice Survey was adapted for this study. The findings showed that 69.6% of the respondents were aged 20-22, 78.3% were female, and 60.9% belonged to the third year. The overall weighted mean of 2.98 indicated that the level of readiness among the nursing students was high. There was no significant difference in the level of readiness for the Related Learning Experience between students aged 20-22 and those aged 23-25, as the computed p-value of 0.868 is greater than the 0.05 level of significance; this implied that age was not a factor in the students' readiness for clinical practice. There was no significant difference in the level of readiness for clinical practice between male and female students, since the computed p-value of 1.00 is greater than the 0.05 level of significance; this inferred that sex was not a factor in readiness for clinical practice. Lastly, there was no significant difference in the level of readiness for clinical practice between third- and fourth-year students, as the computed p-value of 0.610 is greater than the 0.05 level of significance. The null hypothesis was therefore accepted: there was no significant difference in the level of readiness for clinical practice when the students were grouped according to profile.
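The group comparisons above all follow the same decision rule: compare the computed p-value against the 0.05 level of significance and retain the null hypothesis when it is larger. The abstract does not name the statistical test used, so the sketch below is purely illustrative, using an independent-samples t-test from SciPy on fabricated Likert-scale scores (not the study's data) to show how such a comparison is made.

```python
# Illustrative sketch only: the study's actual test and data are not
# stated in the abstract. Scores below are fabricated 4-point Likert means.
from scipy import stats

group_a = [3.1, 2.9, 3.0, 2.8, 3.2]  # e.g., respondents aged 20-22 (hypothetical)
group_b = [3.0, 2.7, 3.1, 2.9]       # e.g., respondents aged 23-25 (hypothetical)

# Welch's t-test (does not assume equal variances)
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)

alpha = 0.05  # level of significance used in the study
if p_value > alpha:
    decision = "fail to reject the null hypothesis: no significant difference"
else:
    decision = "reject the null hypothesis: significant difference"

print(f"p = {p_value:.3f} -> {decision}")
```

With small, unequal groups such as these, a non-parametric alternative like the Mann-Whitney U test (`stats.mannwhitneyu`) is often preferred; the decision rule against 0.05 is the same.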
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.