Apple patent: Metadata generation from nearby devices
Today Apple was awarded a patent covering a process in which, when data is created or saved on a device, the device detects nearby devices ('second devices') and offers possible metadata tags that could be associated with the data:
Identifying the content can include identifying the content that has just been created (e.g., identifying a digital picture right after the picture is taken), or selecting from a list of content that was created when at least one of the second devices was in transmission range with the first device. In the latter case, each content file can be associated with a list of second devices that were present. The user of the first device can have the options of labeling the file with a particular second device or a group of second devices (e.g., multiple labels can be assigned to each file).
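The flow the patent describes — record which second devices were in range when content was created, then offer those devices as candidate labels — can be sketched in a few lines. This is a minimal illustration only; the `ContentFile` structure, function names, and device names are hypothetical and not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class ContentFile:
    """A piece of content plus the metadata tags the user has accepted."""
    name: str
    nearby_devices: list              # 'second devices' in range at creation time
    tags: set = field(default_factory=set)

def suggest_tags(content: ContentFile) -> list:
    """Offer the second devices present at creation as candidate tags."""
    return sorted(content.nearby_devices)

def apply_tags(content: ContentFile, chosen: list) -> None:
    """Assign one or more labels; the patent allows multiple tags per file."""
    content.tags.update(chosen)

# A photo taken while two other devices were in transmission range:
photo = ContentFile("IMG_0042.jpg", nearby_devices=["Alice's iPhone", "Bob's iPad"])
options = suggest_tags(photo)   # candidate tags offered to the user
apply_tags(photo, options)      # user accepts both suggestions
```

The key point from the patent is that the association can happen after the fact: each file carries the list of second devices that were present, so the user can choose one label, several, or none.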
The content can have a variety of formats. For example, the content can be a text file (e.g., a text note), a data file (e.g., a spreadsheet), a multimedia file (e.g., a sound recording, a digital image, or a digital movie clip), or in any other format (e.g., a voice mail message, an entry in a data sheet, a record in the database, etc.).
For OS X and iOS 8 users, the metadata would appear as tags associated with a file, calendar event, contact or note. For Pro Apps users the metadata would appear as keywords associated with stills, audio and video clips recorded on iOS, OS X and other devices.
Those controlling public devices such as iBeacons could also offer up useful metadata for those creating content in public spaces.