Human Activity Recognition (HAR) has emerged as a crucial assistive technology in elderly healthcare, enabling caregivers to monitor and assist with daily activities. HAR is typically conducted by analyzing data from various types of sensors, including body, object, and ambient sensors. This study focused specifically on data from smartphones' internal sensors to recognize activities such as walking, ascending/descending stairs, sitting, standing, and falling. Data was gathered from smartphones, and all HAR models were tested using real-time data collected from Wireless Multimedia Sensor Networks (WMSNs). To perform activity recognition, neural network algorithms such as Convolutional Neural Networks (CNN) and Long Short-Term Memory networks (LSTM) were employed, along with traditional machine learning classification algorithms such as Support Vector Machine (SVM), k-Nearest Neighbors (KNN), and Random Forest. Additionally, dimensionality reduction techniques were applied to decrease the number of features, thereby reducing computational time and energy consumption. Furthermore, transfer learning was employed in different scenarios to improve accuracy. Finally, all methods were implemented and compared in two different WMSN architectures. The results enabled us to identify the most efficient and accurate methods for HAR using data from internal sensors. This research has the potential to enhance the quality of life of the elderly and to equip caregivers with the tools needed to deliver better care.