I've been digging into an intriguing question that came up while working on a side project during my studies.
I experimented with predicting the length of my future phone usage sessions (from unlock to lock), and the results have been pretty remarkable.
My usage data is in the format:
timestamp_of_session, duration_in_seconds
With a straightforward linear regression model, I achieved a Mean Squared Error of only around 3 minutes. Isn't that fascinating?
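Roughly, the kind of pipeline I mean looks like the sketch below (simplified; the filename and the hour/day-of-week features are just illustrative placeholders, and I'm assuming epoch-second timestamps):

```python
# Minimal sketch, not my exact code. Assumptions: a CSV named
# usage_sessions.csv with the two columns described above, and
# timestamp_of_session in epoch seconds.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

df = pd.read_csv("usage_sessions.csv")
df = df.sort_values("timestamp_of_session")  # chronological order

# Simple calendar features derived from the unlock timestamp.
ts = pd.to_datetime(df["timestamp_of_session"], unit="s")
X = pd.DataFrame({"hour": ts.dt.hour, "day_of_week": ts.dt.dayofweek})
y = df["duration_in_seconds"] / 60.0  # predict duration in minutes

# Train on earlier sessions, test on later ones, since the goal
# is to predict *future* sessions.
cut = int(len(df) * 0.8)
X_train, X_test = X.iloc[:cut], X.iloc[cut:]
y_train, y_test = y.iloc[:cut], y.iloc[cut:]

model = LinearRegression().fit(X_train, y_train)
pred = model.predict(X_test)

mse = mean_squared_error(y_test, pred)  # units: minutes squared
rmse = np.sqrt(mse)                     # units: minutes
print(f"MSE:  {mse:.2f} min^2")
print(f"RMSE: {rmse:.2f} min")
```

Note the chronological split: since the task is predicting future sessions, shuffling before splitting would leak information from the future into training.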
I'm really excited to hear your thoughts and insights on this! Thanks in advance for your input!
Also, what do you mean by an MSE of around 3 minutes? Do you mean the root mean squared error (RMSE)?
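To spell out why the units matter (these are just the standard definitions): if the targets $y_i$ are in minutes, then

$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2 \;[\text{min}^2], \qquad \mathrm{RMSE} = \sqrt{\mathrm{MSE}} \;[\text{min}],$$

so a figure of "around 3 minutes" only makes sense as an RMSE, which would correspond to an MSE of roughly $9\ \text{min}^2$.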