A Machine Learning based location recording and activity detection framework for iOS.
- Combined, simplified Core Location and Core Motion recording
- Filtered, smoothed, and simplified location and motion data
- Near real time stationary / moving state detection
- Automatic energy use management, enabling all day recording
- Automatic stopping and restarting of recording, to avoid wasteful battery use
- Machine Learning based activity type detection
- Improved detection of Core Motion activity types (stationary, walking, running, cycling, automotive)
- Distinguish between specific transport types (car, train, bus, motorcycle, airplane, boat)
- Optionally produce high level `Path` and `Visit` timeline items, to represent the recording session at a human level. Similar to Core Location's `CLVisit`, but with much higher accuracy, much more detail, and with the addition of Paths (ie the trips between Visits)
- Optionally persist your recorded samples and timeline items to a local SQL based store, for retention between sessions
More information about timeline items can be found here.
LocoKit is an LGPL licensed open source project. Its ongoing development is made possible thanks to the support of its backers on Patreon.
If you have an app that uses LocoKit and is a revenue generating product, please consider sponsoring LocoKit development, to ensure the project that your product relies on stays healthy and actively maintained.
Thanks so much for your support!
```ruby
pod 'LocoKit'
pod 'LocoKit/LocalStore' # optional
```
Note: Include the optional `LocoKit/LocalStore` subspec if you would like to retain your samples and timeline items in the SQL persistent store.
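For context, a complete Podfile using these pods might look something like this (the platform version, target name, and `use_frameworks!` line are assumptions; adjust them for your own project):

```ruby
platform :ios, '10.0'
use_frameworks!

target 'YourApp' do
  pod 'LocoKit'
  pod 'LocoKit/LocalStore' # optional, for the SQL persistent store
end
```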
```swift
// retain a timeline manager
self.timeline = TimelineManager()

// start recording, and producing timeline items
self.timeline.startRecording()

// observe timeline item updates
when(timeline, does: .updatedTimelineItem) { _ in
    // the current Path or Visit, if recording has produced one yet
    guard let currentItem = timeline.currentItem else { return }

    // duration of the current Path or Visit
    print("item.duration: \(currentItem.duration)")

    // activity type of the current Path (eg walking, cycling, car)
    if let path = currentItem as? Path {
        print("path.activityType: \(path.activityType)")
    }

    // examine each of the LocomotionSamples within the Path or Visit
    for sample in currentItem.samples {
        print("sample: \(sample)")
    }
}
```
```swift
// the recording manager singleton
let loco = LocomotionManager.highlander

// decide which Core Motion features to include
loco.recordPedometerEvents = true
loco.recordAccelerometerEvents = true
loco.recordCoreMotionActivityTypeEvents = true

// decide whether to use "sleep mode" to allow for all day recording
loco.useLowPowerSleepModeWhileStationary = true
```
Note: The above settings are all on by default, so these lines are unnecessary; they're just here to show some of the available options.
```swift
// start recording
loco.startRecording()

// watch for updated LocomotionSamples
when(loco, does: .locomotionSampleUpdated) { _ in
    // the raw CLLocation
    print(loco.rawLocation)

    // a more usable, de-noised CLLocation
    print(loco.filteredLocation)

    // a smoothed, simplified, combined location and motion sample
    print(loco.locomotionSample())
}
```
If you wanted to fetch all timeline items for a specific day (for example today), you might do this:
```swift
let date = Date() // some specific day

let items = store.items(
    where: "deleted = 0 AND endDate > ? AND startDate < ? ORDER BY endDate",
    arguments: [date.startOfDay, date.endOfDay])
```
You can also construct more complex queries, such as fetching all timeline items that overlap a certain geographic region, all samples of a specific activity type (eg all "car" samples), or all timeline items containing samples over a certain speed (eg paths containing fast driving). For example:
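Here's a sketch of one such query, fetching all samples confirmed as "car". It assumes the store exposes a `samples(where:arguments:)` method alongside `items(where:arguments:)`, and that samples record their activity type in a `confirmedType` column; check the store's actual schema, as both names are assumptions:

```swift
// fetch all samples confirmed as "car"
// (method and column names are assumptions; see above)
let carSamples = store.samples(
    where: "confirmedType = ?",
    arguments: ["car"])
```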
Note that if you are using a `TimelineManager`, activity type classifying is already handled for you by the manager, on both the sample and timeline item levels. You should only need to interact with classifiers directly if you are not using a `TimelineManager`, or you want to do low level processing at the sample level.
```swift
// fetch a geographically relevant classifier
let classifier = ActivityTypeClassifier(coordinate: location.coordinate)

// classify a locomotion sample
let results = classifier.classify(sample)

// get the best match activity type
if let bestMatch = results.first {
    // print the best match type's name ("walking", "car", etc)
    print(bestMatch.name)
}
```
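The classifier returns a ranked collection of candidate types, not just a single winner. Here's a brief sketch of inspecting the full result set, assuming each result item exposes a numeric `score` alongside its `name` (check the results type for the actual properties):

```swift
// print every candidate activity type with its match score
// (the `score` property is an assumption; see above)
for result in results {
    print("\(result.name): \(result.score)")
}
```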
Note: The above code snippets use SwiftNotes to make the event observing code easier to read. If you're not using SwiftNotes, your observers should be written something like this:
```swift
let noteCenter = NotificationCenter.default
let queue = OperationQueue.main

// watch for updates
noteCenter.addObserver(forName: .locomotionSampleUpdated, object: loco, queue: queue) { _ in
    // do stuff
}
```
If you want the app to be relaunched after the user force quits, enable significant location change monitoring.
More details and requirements can be found here.
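As a minimal sketch, one way to enable it is with Core Location's own API at app launch (this assumes you manage a `CLLocationManager` yourself; LocoKit may provide its own setting for this):

```swift
import CoreLocation

let locationManager = CLLocationManager()

// requires "always" location authorization
locationManager.requestAlwaysAuthorization()

// ask iOS to relaunch the app for significant location changes,
// even after a force quit
locationManager.startMonitoringSignificantLocationChanges()
```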
- Download or clone this repository
- Run `pod install`
- In Xcode, change the Demo App project's "Team" to match your Apple Developer Account
- In Xcode, change the Demo App project's "Bundle Identifier" to something unique
- Build and run!
- Go for a walk, cycle, drive, etc, and see the results :)
- To see the SDK in action in a live, production app, install Arc App from the App Store, our free life logging app based on LocoKit