Articles tagged with: Apple

FCPX Creative Summit 2016 Provisional Schedule - More from Apple

Wednesday, 08 June 2016

The provisional schedule for October’s FCPX Creative Summit is now available.

Interesting: Instead of last year’s 90 minute presentation given twice to two groups, the schedule shows a 60 minute ‘General Address’ followed by a choice between 90 minute breakout sessions:

2:00 – 3:00pm General Address: The Future (Apple Campus)
3:00 – 4:30pm Apple Session Breakouts (Apple Campus)

What could Apple be talking about in these sessions that would mean attendees have to choose one session over another?

Another point: The Summit was held in late June last year. This year it will be in late October. Given this event is organised to fit in with the plans of the ProApps team, there is a chance there will be more to talk about later this year.

Next week at WWDC 16 there is a chance that Apple will announce or pre-announce a new version of the Mac Pro, just as they did in 2013. Final Cut Pro X is the application most people understand needs a lot of power. Perhaps Apple will once again use a Final Cut screenshot during the keynote (which will be streamed online on Monday).

Apple’s structure editing patent

Tuesday, 07 June 2016

While editors wait for the next big Final Cut Pro X update, I hope the Apple ProApps team will implement some of the ideas in their ‘structure editing’ patent. Here’s my old writeup of the patent they applied for in 2009 on fcp.co:

Most people think that the editor’s job is ‘to cut out the bad bits’ in individual scenes. Many are surprised to discover that editors commonly change and improve storytelling by changing story structure. As many film and TV makers consider that structure is very important when it comes to telling stories, I think it is a good idea for video editing software to recognise story structure.

Structure applies to feature films, TV shows, groups of corporate videos on an intranet, legal video depositions, architects’ video proposals or open-ended weekly web series. The more video applications can have these structures encoded in their projects, the better the tools they’ll be able to provide to a wider range of people all over the world.

The Foundry on high-end post production applications for iOS

Wednesday, 10 June 2015

Don't believe Final Cut Pro X or Adobe Premiere will run on iOS one day? Apple’s Metal for iOS might be the key.

Jack Greasely, Head of New Technology at The Foundry (makers of feature film post tools such as NUKE and MODO), talking to RGB HQ:

As Metal originated in iOS does this mean that there is the potential to run 'serious' applications, such as MODO, NUKE or even MARI on an iPad one day?

Anything is possible. Having a common graphics API between the two is certainly a start. What is maybe more interesting is a WYSIWYG workflow between IOS and OSX. You could use your Mac to design assets in MARI / MODO / NUKE and then have them display / rendering live on a mobile device looking exactly the same.

Using the iPad's accelerometer, Foundry tools might be able to render graphics as AR overlays.

On stage at Apple's WWDC 15

Jack also appeared on stage at the Apple WWDC conference this week - 10:58 into the video at developer.apple.com. He showed how much The Foundry team were able to achieve in four weeks of work adding Metal support to MODO, their 3D modelling and animation application.

Apple's official new mission: “Leave the world a better place”

Wednesday, 10 June 2015

Apple has changed the corporate definition that they include in their press releases:

Apple revolutionized personal technology with the introduction of the Macintosh in 1984. Today, Apple leads the world in innovation with iPhone, iPad, the Mac and Apple Watch. Apple’s three software platforms — iOS, OS X and watchOS — provide seamless experiences across all Apple devices and empower people with breakthrough services including the App Store, Apple Music, Apple Pay and iCloud. Apple’s 100,000 employees are dedicated to making the best products on earth, and to leaving the world better than we found it.

Pity the grammar is a little off. Shouldn't it be "leaving the world better than they found it"?

Compare this new definition with the way Apple described itself last week - which had remained unchanged for over three years - since January 2012: 

Apple designs Macs, the best personal computers in the world, along with OS X, iLife, iWork and professional software. Apple leads the digital music revolution with its iPods and iTunes online store. Apple has reinvented the mobile phone with its revolutionary iPhone and App Store, and is defining the future of mobile media and computing devices with iPad.

Apple's mission in 1995:

Apple Computer, Inc., a recognized pioneer and innovator in the information industry, creates powerful solutions based on easy to use personal computers, servers, peripherals, software, online services and personal digital assistants. Headquartered in Cupertino, California, Apple Computer, Inc. (NASDAQ: AAPL) develops, manufactures, licenses, and markets products, technologies, and services for the business, education, consumer, scientific & engineering and government markets in over 140 countries.

A corporate definition that could apply to almost any tech company back then - apart from the mention of PDAs.

I've written previously about how this definition changed between 1995 and 2012.

Credit for noticing Monday's change goes to UK-based Mac journalist Lucy Hattersley.

Apple WWDC 2015 and post production

Tuesday, 09 June 2015

Here's my take on the announcements at the Apple Worldwide Developers Conference 2015.

OS X El Capitan

First came details on the next version of OS X, named El Capitan. El Capitan is one of the mountains in Yosemite National Park in California. The naming is similar to the way OS X Mountain Lion came after OS X Lion and OS X Snow Leopard came after OS X Leopard. It signals that this update isn't as big from a user's point of view: Mountain Lion, Snow Leopard and El Capitan have fewer big new features that most Mac users will get excited about.

These updates give developers the chance to catch up on new Apple technologies, and give Apple the chance to introduce innovations that developers can use to do new things. For example, Apple may have added more features to AV Foundation, the code that lets application developers (inside and outside of Apple) do more with movies and audio. Apple Marketing won't tell the public about AV Foundation updates this year, but will hope that new OS X (and iOS and watchOS) abilities will mean better AV applications in the coming months.

Spotlight improvements

El Capitan adds natural language searches: "The pictures I took last year in London." This should also work for searches based on media metadata, and eventually metadata added within applications such as Final Cut Pro X: "Select favourites from the second half of the concert in Manchester featuring the drummer and the bass guitarist shot on a GoPro." Apple haven't yet announced links between media metadata and Spotlight searches, but natural language searches in iMovie and Final Cut Pro X would be useful.
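There's no public natural-language-to-metadata API yet, but the underlying Spotlight metadata queries have been available to developers for years. Here's a minimal sketch, assuming the standard NSMetadataQuery route on OS X, of the kind of query a phrase like "the videos I shot in the last year" might be translated into - the attribute keys are real Spotlight keys, the translation itself is my assumption:

```swift
import Foundation

// A sketch, not Apple's implementation: find movie files created in the last year
// using Spotlight's metadata store. Run as a command-line Swift script on OS X.
let query = NSMetadataQuery()
query.searchScopes = [NSMetadataQueryLocalComputerScope]

let oneYearAgo = Calendar.current.date(byAdding: .year, value: -1, to: Date())!
query.predicate = NSPredicate(
    format: "%K == %@ AND %K >= %@",
    NSMetadataItemContentTypeTreeKey, "public.movie",
    NSMetadataItemFSCreationDateKey, oneYearAgo as NSDate
)

_ = NotificationCenter.default.addObserver(
    forName: .NSMetadataQueryDidFinishGathering,
    object: query,
    queue: .main
) { _ in
    query.stop()
    for case let item as NSMetadataItem in query.results {
        print(item.value(forAttribute: NSMetadataItemPathKey) ?? "")
    }
    exit(0)
}

query.start()
RunLoop.main.run()  // keep the script alive while Spotlight gathers results
```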

In iOS, apps can make their content available to Spotlight so that an iPhone- or iPad-wide search can find content in a specific part of an app.
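On iOS the hook is the new Core Spotlight framework. Here's a minimal sketch, assuming a hypothetical clip-logging app that wants its clips to show up in a device-wide search - the identifiers and keywords are invented for illustration:

```swift
import CoreSpotlight
import MobileCoreServices

// Sketch: index one logged clip so Spotlight can find it outside the app.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeMovie as String)
attributes.title = "Manchester concert - second half, drummer close-up"
attributes.keywords = ["favourite", "GoPro", "concert"]

let item = CSSearchableItem(
    uniqueIdentifier: "clip-0042",               // app-specific ID, made up here
    domainIdentifier: "com.example.cliplogger",  // hypothetical app domain
    attributeSet: attributes
)

CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error)")
    }
}
```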

Metal

[20:30 into main keynote] Last year Apple introduced Metal - a way for iOS gaming applications to better access the power of iPhone and iPad CPUs and GPUs. Last year the emphasis was on how this would make iOS games better. This year Apple had a demo of how well a game worked with Metal on OS X.

Metal has also evolved to speed up more of OS X. In El Capitan, Metal improves the speed of Core Animation and Core Graphics. Compared with these libraries executing OpenGL commands, they now render 'up to' 50% faster on the same hardware.

[21:20] Interestingly for post production people, Apple also said how Metal would speed up 'high performance apps.' It does this by replacing OpenGL graphics code and OpenCL distributed processing code (for sharing work between CPUs and GPUs).
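To make that concrete, here's a minimal Metal compute sketch in Swift - not Apple's or Adobe's code. It dispatches a hypothetical 'brighten' kernel (which would live in a .metal file compiled into the app) over a buffer of pixel values, the sort of work that previously went through OpenCL:

```swift
import Metal

// Sketch of a single GPU compute pass with Metal.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),            // the app's compiled .metal shaders
      let kernel = library.makeFunction(name: "brighten"),  // hypothetical kernel name
      let pipeline = try? device.makeComputePipelineState(function: kernel) else {
    fatalError("Metal is not available or the kernel is missing")
}

let pixels = [Float](repeating: 0.5, count: 1920 * 1080)
guard let buffer = device.makeBuffer(bytes: pixels,
                                     length: pixels.count * MemoryLayout<Float>.stride,
                                     options: []),
      let commandBuffer = queue.makeCommandBuffer(),
      let encoder = commandBuffer.makeComputeCommandEncoder() else {
    fatalError("Could not create Metal objects")
}

encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)

// One thread per pixel, dispatched in groups of 64.
let threadsPerGroup = MTLSize(width: 64, height: 1, depth: 1)
let groups = MTLSize(width: (pixels.count + 63) / 64, height: 1, depth: 1)
encoder.dispatchThreadgroups(groups, threadsPerThreadgroup: threadsPerGroup)

encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
```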

[21:45] The first developer story of the WWDC keynote was from Adobe. They've been able to speed up After Effects CC rendering by 8x using Metal for OS X, so animations can be rendered in real time. Instead of waiting for Illustrator CC to rerender complex graphics when zooming, rendering now happens instantly - interactive graphic changes with no waiting for rendering.

“We are committed to adopting Metal on our OS X apps. With performance increases of up to 8x, we are excited about what Metal can do for our Creative Cloud users.” - David Wadhwani, Sr. VP & GM, Digital Media, Adobe [22:08] 

David McGavran of Adobe Systems demoed the speed improvements in After Effects CC and Illustrator CC during the ‘Platforms State of the Nation’ session [1:32:15 into this video]. He said that Adobe apps like Premiere Clip already benefit from Metal on iOS.

“Pro app makers are seeing the benefits of Metal, like The Foundry and Autodesk. I think we're going to see pro users, gamers and all of us benefiting from the performance advantages of Metal” - Craig Federighi, Apple [26:55]

AV Foundation

AV Foundation is the part of OS X (and iOS) that applications use to manipulate video and audio. The Editing Movies in AV Foundation developer session has the following description:

Learn how to use the new AVMutableMovie class to modify media files and simplify your editing workflows. See how to support segment-based editing and discover the power of sample reference movies.

The developer documentation for the version of AV Foundation in El Capitan hasn't yet been updated to include AVMutableMovie.
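Here's a hedged sketch of the sample-reference workflow the session description hints at, using the AVMutableMovie method names from the shipping SDK; the file paths are invented and the 30-second range is arbitrary:

```swift
import AVFoundation

// Sketch: build a 'sample reference movie' that points at the first 30 seconds of a
// source file without copying any media data, then write just the movie header.
let sourceURL = URL(fileURLWithPath: "/Movies/interview-full.mov")       // hypothetical paths
let referenceURL = URL(fileURLWithPath: "/Movies/interview-selects.mov")

do {
    let source = AVMovie(url: sourceURL)
    let edit = try AVMutableMovie(settingsFrom: source)

    let thirtySeconds = CMTimeRange(start: .zero,
                                    duration: CMTime(seconds: 30, preferredTimescale: 600))
    // copySampleData: false keeps this a reference movie - samples stay in the source file.
    try edit.insertTimeRange(thirtySeconds, of: source, at: .zero, copySampleData: false)

    try edit.writeHeader(to: referenceURL, fileType: .mov)
} catch {
    print("Could not build reference movie: \(error)")
}
```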

Speed improvement

According to Pedro Santamaría on Twitter, the current version of Final Cut Pro X runs faster under the El Capitan beta on his 2012 MacBook Air - as tested using my BruceX benchmark:

This is impressive given that operating system betas aren't tuned for speed. I'll add any update he gives on how much faster the Mac Pro is running El Capitan.

Also…

In each keynote Apple likes to show slides that list ‘too many features to go into right now.’ Some that are relevant to post production are:

File copy resume - could mean that the Finder (or other applications) will resume copying files after a crash or other interruption

Photos editing extensions - could make it possible to change photos within video and motion graphics applications. No ‘Movie editing extensions’ yet

AirPlay Video - OS X users can already play videos on Apple TVs on the same network; perhaps this mention means that other Macs will also be able to play back AirPlay video.

Should I install OS X El Capitan?

No. Not today.

Unless you are developing Mac software. Although Apple hope it won't cause any problems on your Mac, it wouldn't surprise experienced developers if a fault wiped every attached hard drive. At the moment there are reports ranging from "No problems" to "Final Cut Pro X crashes constantly." If you must try it, I suggest you wait for the version Apple releases as part of its public beta programme.

As regards compatibility, if a Mac can run OS X Yosemite today, it will be able to run the release version of OS X El Capitan tomorrow.

Apple ecosystems

Other keynote announcements show that Apple want to maintain existing ecosystems and create new ones. As well as supporting big players, their ecosystems include support for small companies and individuals to do well. This makes sense to iOS and OS X developers selling through Apple's App Stores.

Apple Pay

As well as adding more banks and the UK to Apple Pay, Apple mentioned that Square will soon introduce a terminal that will allow anyone to accept Apple Pay payments.

News

The News iOS app is a place for syndicated content from news and media organisations. Apple is also considering content from smaller sites and individuals. For now they need to set up an RSS feed of their stories and apply to Apple stating which kinds of content they create:

News brings together high-quality news, magazine, and blog sources in a single beautiful content experience. Whether you’re a major news organization or an individual blogger, you can sign up to deliver your content to millions of iOS users.

...

Topics are created and assigned by Apple’s expert editors and sophisticated algorithms.

News Publishing Guide - Apple

This means that if you can demonstrate that you provide relevant content on a subject of interest to just a few thousand people, Apple's News app might be able to help you connect with the iOS users amongst them. 

You keep 100% of the revenue from any advertising you sell yourself, and can optionally take 70% of the income from ads sold through Apple's iAd system.

Apple Music

During the launch of Apple Music, Apple made a point of including unknown musicians. As well as being able to have their music included in Apple Music, Connect helps them maintain their community of fans by adding text, audio, pictures and video to their Apple Music page. Apple Music also takes into account how individuals within families have different music preferences by offering a good value family plan.

At the moment Apple’s Beats 1 worldwide radio station seems aimed at a limited demographic: those interested enough in current and new music to want to hear well-chosen tracks, and willing to pay for a music subscription. Hopefully Apple will be able to create more advanced radio experiences in future.

Future media ecosystems

This prompts the question of where video, TV and film fit into Apple's plan. If Apple is consistent with what they are about to do with News and Music, individuals, small groups and large content creators will be able to share their video content in the same way.

If Apple Movies was built in the same way, there would be an iOS application which would provide a single place to consume and discover video content. It would combine human curation with algorithms that would learn your preferences. If you are a producer, Apple would provide simple tools to make your content available (News) and build audiences (Apple Music Connect).

A similar ecosystem could be built around podcasts - perhaps supported by a worldwide Apple radio station that features presenters and excerpts from podcasts, audio books and radio drama.

The Apple Music family plan prompts me to point out that some media - music, TV, movies - is fun to share with others. Perhaps Apple should find a way for software to create combined streams that would entertain groups of people: A family playlist for everyone until 10pm, then content for the parents. "Stick with this 15 minute short that only your brother likes, something you really like will be on next." This could work for any group of people - including groups not gathered in one place: hanging out across the internet.

If your media has to fit in a shared customised stream, the methods you use to tell stories might change.

If Google and Apple eventually meet in a battle of software on hardware vs. software in the cloud, Apple might need to change the field of battle. If hardware devices become so ambient that they no longer need to be associated with an individual - apart from an earpiece running Siri - Apple's hardware integration edge will become irrelevant. What survives will be Apple's ability to maintain and support media ecosystems.

Visit the Apple Campus for a Final Cut Pro X presentation on June 26

Thursday, 04 June 2015

It seems that after years of very little access, Apple is opening up a little more. On June 26 members of the public will be visiting Apple's offices to get an update on Final Cut Pro X. The kind of access that is usually granted only to a favoured few is available to attendees of Future Media Concepts' FCPX Creative Summit:

FCPX Creative Summit attendees have the unique opportunity to visit the Apple Campus in Cupertino and hear directly from FCPX product managers! You’ll get a unique perspective on how this video editing software has changed the industry and how it continues to innovate today.

Get an update from Apple Product Managers on the current release of Final Cut Pro X, exciting customer stories, and the thriving ecosystem of third-party software and hardware.

Representatives of Apple's ProApps team have appeared at professional events over the years, but this event marks the first time a large group of professionals have been invited to visit Apple.

Future Media Concepts is a company that runs training courses in media production in the USA, Canada and online. They also organise post production events such as the Editors Retreat, After Effects World and the Creative Cloud Masters conference.

Livinia Smith, Future Media Concepts' event marketing manager for the FCPX Creative Summit, says that after running events for Adobe and Avid users for many years, recent improvements in Final Cut prompted them to turn to Apple's software. The weekend of June 26-28 is just over four years since Final Cut Pro X was launched. Did that factor into the timing? "Future Media Concepts approached Apple about hosting an event dedicated to this platform. We both decided the date for the conference," says Smith.

Smith went on: "Regarding the visit to the Apple Campus, when we pitched the idea to Apple, they saw value in directly interacting with this community of FCP users and they agreed to host a talk with the conference attendees in a lecture room at Apple."

Peeking out over the parapet of a besieged castle

Although Final Cut Pro X and its companion applications Compressor and Motion have been very successful over the years, Apple hasn't seen the need to publicly involve itself with the user community. Compare their activities with those of Adobe and Avid - companies whose video editing applications were the traditional competitors of Final Cut Pro.

As well as constantly updating their websites with Premiere Pro and Media Composer case studies, their online activities include blog posts, tweets and Facebook updates with named staff members. They run support forums that feature contributions from software engineers. If a small user group somewhere in the USA gets in touch with Adobe to say they're organising a meeting about Premiere Pro, there's a good chance product manager Al Mooney will appear to give an entertaining presentation on his baby.

In recent years parts of Apple have been interacting a little more with the wider world. For example, last year's launch of Swift, a new programming language for developing OS X, iPhone and now Watch apps, was a big surprise. Apple going on to launch a blog about Swift programming was even more of a surprise.

Anyone who visits the online forums discussing Adobe Premiere Pro CC, Avid Media Composer and Final Cut Pro X knows that the harshest critics of most applications are those who use them every day for their livelihood. The combination of a long-established culture of Apple not sharing much information and the rabid nature of online power user debate means that it will be hard for the Final Cut Pro X team to change how they interact with the wider Final Cut community.

On the way towards a professional application community

Hopefully the ProApps team will be able to more directly support a Final Cut Pro X community. Online support would include:

  • A buyer's guide for third-party hardware and software
  • A consultants network
  • Continually updated training materials
  • A job board for employers and job seekers
  • Forums and discussion groups where the developers of the application itself can take part
  • Regular conferences so people can learn from each other and network

The majority of Final Cut users are individuals who don't need to set up complex workflows and never need to call on consultants. However, knowing that there is a robust community standing by makes trying a complex new application that bit less daunting.

Although this kind of community might seem at odds with the way Apple works, they have a model of their own they can look to: FileMaker. FileMaker is Apple's professional database system. The FileMaker website has all the features I listed above.

It is interesting that Apple refers to FileMaker as a platform - as it is made up of an authoring tool, a server product and software that runs on Macs, PCs, iOS devices and in web browsers.

Perhaps the ProApps applications might end up as a platform/ecosystem too. I hope June's FCPX Creative Summit is a step on the way.

Disclosure: I'm happy to say I'm presenting a couple of sessions on Apple Motion at the Summit.

 

Apple creative apps architect Randy Ubillos speaking in LA and San Jose

Wednesday, 13 May 2015

The Los Angeles Creative Pro User Group has announced that ex-Apple employee Randy Ubillos will be speaking at public events in May and June. 

Until April 23rd Randy Ubillos was a very important member of Apple's application software team:

His influence on Mac software started years before he joined Apple. He developed the first versions of the Adobe Premiere video editing software. Since joining Apple he's worked on Final Cut Pro, iMovie and iPhoto amongst others.

On May 27, 2015 he will be appearing at the May LACPUG meet in Los Angeles. On June 26, he will be appearing at the Bay Area SuperMeetUp - a similar event in San Jose.

It isn't common for ex-Apple employees to talk publicly about areas of expertise they covered while working at Apple. Especially so soon after leaving the company. I guess this is either very bad news or very good news. The negative explanation is that Randy resigned because his vision for the future of Photos, iMovie, Final Cut Pro X and other applications he was involved with was too different from Apple's plans. His resignation was interpreted by some as a sign that Apple are about to give up on their professional applications - including Final Cut Pro X, Motion, Compressor and Logic Pro X. The bad news would be that Randy feels embittered enough to almost immediately go public with problems at Apple.

The 'good news' interpretation is that Randy appearing in public is part of Apple loosening up - that they understand that it is a good idea if users understand more about the people and motivations behind Apple software.

The good news is that the agenda at the LACPUG website says that Randy will be talking about his enthusiasm for the idea of telling stories with video: 

Randy will speak about his own moviemaking experiences and the power of video to inspire and document our lives. He will also provide tips and tricks for making your own movies.

That kind of talk could be designed to establish his bona fides for a new passion project supporting video literacy. A good sign is that he will also be joining post production experts to answer film making questions in a 'Stump the Gurus' session.

There's no sign that he'll be 'dishing the dirt on' or revealing Apple secrets about Final Cut Pro X, Photos and Aperture. Mike Horton of LACPUG specifically tweeted:

However, the fact that Randy is speaking in public so soon after leaving Apple is a good sign.

Worldwide NLE Market until 2020: It's all about price

Thursday, 26 February 2015

A new report by market researchers Frost & Sullivan on the global non-linear editing market makes some interesting points and raises some interesting questions.

[The report] observes the NLE market through the lenses of the broadcast, post-production, and professionals segments during 2013–2020

The headline: Product Affordability Drives Sales of Pro-Video Non-Linear Editing Solutions Worldwide 

Here are some of their highlights:

  • With a steady pricing decrease since 2003, the professional editing solutions market opened up to a large number of users across all segments. Eventually, the availability of products at $1000 created a large user base of individual professionals who typically owned small boutique studios or small production houses.
  • The downward price spiral, the intensifying competition, and the slow growth in sales in the broadcast and post-production segments are challenges to the growth of high-margin, high-end NLE products.
  • The NLE market is preparing itself for innovation around cloud-based solutions, which are likely to challenge the adoption of off-the-shelf products.
  • The fast-growing and highly fragmented consumer devices market makes multi-platform content delivery a key requirement driving growth in the broadcast segment. In the post-production segment, adoption of solutions that help in increasing operational efficiency and reducing bottom-line while turning around content quickly is key.
  • The main vendors in this space are Adobe, Apple, Quantel, Avid, and Grass Valley. Long-term growth in the pro-video segments (broadcast and post-production) is expected to be determined by a vendor’s ability to innovate, provide constant upgrades, and create an easy yet holistic ecosystem around the editing products. The ability to integrate into collaborative workflows and ensure interoperability will also be critical.

In return for your contact information, you can download a preview of the report from the Frost & Sullivan site.

According to the preview, a couple of questions the full report answers are:

  • Are the existing competitors correctly structured to meet customer needs? Will competing companies and products continue to exist or will they be acquired by other companies? Will these products become features in other markets? 
  • What technical trends, including cloud technologies, are shaping the marketplace? What trends are on the horizon, and what does this mean for future product strategy? 
  • How will the pro-video segments, such as broadcast and post-production, compare with the professionals segment that largely comprises individual video creators?

For those of us who can't justify buying these kinds of reports, perhaps discussing these questions might be useful. For example, how good are Apple, Adobe, Avid, Grass Valley and Quantel at:

  • Innovation
  • Providing constant upgrades
  • Creating an easy yet holistic NLE ecosystem
  • Integration into collaborative workflows
  • Interoperability

Apple's OS development cycle and Final Cut Pro X

Tuesday, 10 February 2015

Most people who use Apple's editing software applications have no inkling of how software is developed. For the many who have strong opinions about what features are missing from Final Cut Pro X and what doesn't work well in it, it is worth taking the time to consider Apple's software development cycle.

Apple is more tight-lipped than most when it comes to talking about how they build products and services, but a recent series of podcast episodes is a good place to start. The Debug podcast from iMore covers tech software and services for computers, tablets and phones. Episode 60 features a conversation with people who were involved in making multiple versions of OS X and iOS at Apple over the years.

It is worth understanding what they say about deadlines and development when it comes to OS X and iOS, because it is likely that similar rules apply to Final Cut Pro X development. The main difference is that of timeframes. The podcast interviewees talk about how little time they have for actual development when their OSs are being updated every year. Because they are also fixing bugs found in the current version and the next version, most features have to be implemented in two and a half months out of every 12. (The longest time between major updates of Final Cut Pro X has been 14 months: between 10.0.6 and 10.1.)

[43:42] It's not a yearly cycle. Part of the cycle of what you're going to do for one release happens during the previous release. Let's say you're going to do announcements in June [at the worldwide developer conference] and you're going to bring out a product (whether it's iOS or OS X) in September/October… You start figuring part of that out in the spring time of the previous cycle. Because you have a set of features where it's 'do or die time:' you've passed that two and a half months of typical development time in the previous December, January or February. Everyone's got to work out what features are going to make it …

[45:22] You have a big spreadsheet [at the start of March] which has everything and you decide what's going into that release. [Higher ups would look at each feature and say…] "Not this time, we're gonna bump that." Sometimes there were crushed [software] engineers, sometimes there were completely relieved engineers… that gets bumped to another list. Basically everyone starts at that point to start panicking about bugs and gets into a dead run for that release. Once the new one comes out, everyone says "What the hell have we got" - that's one bucket we draw from: what got deferred. Another bucket is 'what did we make up in the meantime?' [new ideas for features thought up between March and August] and there's the bucket of things we need to do because people are pestering us about them…

[47:27] It's usually a month before it goes out the door that the product management team and the product marketing team start gathering the stuff up together for a big meeting, usually in October or November and then you decide what that release is going to be…  

[48:19] …they would decide what the 'theme' of the release would be… 

[48:40] …you really start some time in December getting on with it… 

[50:19] …there's the list of features that project management is tracking […which are part of…] 'tentpoles' - the small number of features that define the release … top down decisions informed by 'bottom-up' feedback: "have you seen what Android has introduced, the Palm Pre is going to have such and such"

[56:01] From the moment the feature list is defined, until that release hits, everybody and their brother and their mother is trying to come in with another feature… 

[The trick is to match the amount of new features you want to implement with the amount of resource you have to implement the features - you have to say "no"]

[1:01:33] …progress reports start trickling in at the end of January, sometimes February. There's a big feature review in March … [some features aren't going to be ready for August, so need to be dropped] "Oh my god, we've just kicked out one of the tentpoles, oops, we've just kicked out another tentpole. The tent's collapsed we've got to come up with something else" […] That's where things get really 'exciting' [unpleasant].

 

This conversation shows that even Apple, the richest company in the world, has to limit what it does when developing software - even when that software is for iPhones, the hardware that accounts for 67% of Apple's profits.

PS: Brooks' Law: 'adding manpower to a late software project makes it later'

Apple drops ProApps from corporate definition

Monday, 15 September 2014

For over 10 years Apple have included a mention of their professional video and audio applications in their corporate definition. Like most companies, they define who they are in every press release they put out.

This week they dropped the words 'professional applications' from their definition:

Apple reinvented the mobile phone with its revolutionary iPhone and App Store, defined the future of mobile media and computing devices with iPad and has announced Apple Watch, its most personal device ever. Apple leads the digital music revolution with its iPods and iTunes online store, continues the rapid pace of innovation of mobile software with iOS and integrated services including Apple Pay and iCloud. Apple designs Macs, the best personal computers in the world with OS X, and free iOS and OS X apps like iWork and iMovie.

The first mention was in July 2004:

Apple ignited the personal computer revolution in the 1970s with the Apple II and reinvented the personal computer in the 1980s with the Macintosh. Today, Apple continues to lead the industry in innovation with its award-winning desktop and notebook computers, OS X operating system, and iLife and professional applications. Apple is also spearheading the digital music revolution with its iPod portable music players and iTunes online music store.

Apple's previous nine definitions: 1995-2012

 

 

How many copies of Final Cut Pro? Apple’s numbers

Monday, 25 August 2014

Creative COW forum member Franz Bieberkopf has done some interesting research and rounded up the numbers when Apple have announced how many copies of Final Cut Pro have been sold over the years:

Though I think user numbers are of limited value, it has become a bit of an interest to me (particularly in light of how secretive and vague the various developers tend to be, and in light of the sometimes outrageous claims here). I dug into past announcements from Apple in order to sketch the shape of the numbers that we do know (even including a graph!), and thus the growth curves over the past 15 years.

Read more at the Apple FCPX or Not: The Debate forum at Creative COW.

 

Televisual Production Technology Survey 2014: Editing Software

Thursday, 21 August 2014

As part of a survey of '100 senior production staff' Televisual asked about what post-production software they use.

Televisiual2014

 

Great news for Avid.

Who will step up?

Although this looks like bad news for Final Cut Pro X fans, I'm surprised it is used by ten of those surveyed. 

…the FCPX upgrade which alienated many users. “We always edited FCP until Apple produced a useless upgrade version,” says one indie head of production.  Respondents score FCPX poorly in terms of workflow, support and feature set – but highly in terms of price.

If post companies find X is too limited 'in terms of workflow, support and feature set', then fewer companies will be using it next year.

Would higher usage amongst this group of 100 companies result in bigger sales to the many professionals who would find Final Cut Pro X useful? If X gets more high-end features, that would be a sign that the business of these 100 execs matters to Apple.

It also falls to third parties to provide better workflow consultancy and support options - if they still think there's a large enough potential market for Final Cut.

If you think what they said is relevant to your buying choices, go to the Televisual site to see what the 100 said about compositing, grading, 4K and cameras.

Thanks to the MotionVFX mBlog for pointing me in the direction of this survey.

 

Apple patent: Media compilation generation

Tuesday, 19 August 2014

Apple have been assigned a patent concerning the creation of video compilations based on individual preferences. 

The present disclosure is directed to a online video parsing application for parsing one or more videos and for performing one or more actions on particular portions of the videos. For example, the online video parsing application may identify portions of interest within particular videos. A "portion" of a video refers to at least a subset of content included within the video, and may be designated by a time interval. The video parsing application may process any type of video, and any type of video portion. The online video parsing application as discussed below may be implemented using any suitable combination of software, hardware, or both.

In some embodiments, the online video parsing application may create a compilation of video content. Compiling will be understood to mean concatenating, or arranging in series, videos or portions of videos thereby forming a new video. Compiling video, portions of video, or combinations thereof, may provide a technique for delivering desirable content to a user. In some approaches, compilations may be generated based on user input, may be manually assembled by a user, or both. For example, a user may specify content of interest by manually selecting portions of online videos. The user may also input keywords, preference information, any other suitable indicators, or any combination thereof to the online video parsing application for searching video content. In some approaches, the online video parsing application may generate compilation videos using, for example, information provided by the user, automated processes, or both.

8812498-compilations-content

The idea depends on tagging parts of online content. This could be done by their creators, third parties, or by software. Individuals could profit from being curators who discover and tag content well.

Users would be able to specify how long they want their compilation to be: from a few minutes to a continuous feed. 

I've written before about how iTunes Radio could be more than a service that plays music: it could make a custom radio station based on the full range of content a person might find interesting. It looks like Apple will be able to do this with other forms of media.

PS

The patent refers to 'online video' as 'podcast.' As podcasts can be audio or video podcasts, I wrote this post replacing the word 'podcast' with 'video' or 'online video.' One trick when applying for patents is to get protection for ideas in such a way that the competition don't think the idea applies to them.

Those who make the best compilations win!

 

 

What’s next for Mac Pro graphics cards?

Wednesday, 13 August 2014

If Apple update the Mac Pro this year, there's a very good chance that as well as introducing faster CPUs, they'll offer faster graphics cards. The FirePro D300, D500 and D700 in last year's Mac Pro are manufactured by AMD. AMD have spent the last few months updating the FirePro cards they make for PCs. The specifications of these new cards show how much more AMD can do for the same money.

The cards in the Mac Pro are custom-made for Apple, but there are some rough equivalents between the D-series and the PC W-series. For example, the D300 has similar specifications to the W7000.

The W7000, W8000 and W9000 - the PC equivalents of the D300, D500 and D700 - first appeared in 2012. This year Apple may base new Mac Pro GPU cards on more recent AMD designs.

Here is a table edited together from tables on the AnandTech site. The table is divided into three groups - representing low-, medium- and high-end Mac Pro options. Each group shows the original AMD card, the Apple-specified Mac Pro card and the 2014 update of the 2012 PC card.

CPU-cards-table-a


At each level AMD have at least doubled the VRAM and added 40% more stream processors. The W8100 and W9100 have wider memory buses (so more information can be transferred for each command) and many more transistors.

Although Apple can specify any number of stream processors, clock speeds or VRAM, these more recent cards show what AMD considers low-, medium- and high-end when it comes to PCs. From a Mac owner's perspective, they show how much more card AMD can now make for a similar amount of money compared with the 2012-era cards in the Mac Pro.

Find out about the W7100, W8100 and W9100 by reading more at AnandTech.

 

AppleCare Professional Video Support no longer a separate service

Saturday, 09 August 2014

UPDATE: The previous version of this post said that Apple have discontinued AppleCare Professional Video support. In fact support for video, audio and Xsan is still available as part of AppleCare OS Support. They are no longer available as standalone products.

Perhaps Apple weren't getting much take-up of their Video, Audio and Xsan support services as individual products. Hopefully they want to encourage third parties to establish services based on specific areas of expertise.

Old price of AppleCare Pro Video Support: $799/year. Price for AppleCare OS Support starts at $5,995/year.

Good news for FCPWORKS in LA and NMR in London?

(August 13 update: Sam Mestman of FCPWORKS has responded to this post)

Here's the original version of this post for reference:

video-finalcut-over

Apple has removed references to its AppleCare Professional Video Support service from its website. Also Professional Audio and Xsan Support are gone.

Here's how the Video Support product was described:

AppleCare Professional Video Support is perfect whether you are editing HD video, or designing motion graphics. Because Apple builds the entire video editing solution — from hardware to software to the operating system — one phone call to AppleCare can address most of your technical needs, providing integrated support that you can’t get anywhere else.

Its product page is now missing. Here is the archive copy made by the Internet Archive Wayback Machine on Wednesday 6th August.

Other professional AppleCare services are still available. On Wednesday that page also linked to Professional Video, Audio and Xsan support.

no-longer-available

Video, Audio and Xsan AppleCare Pro products are no longer available on the online Apple Store (Archive).

Interesting move for Apple. 

 

 

Apple patent: Metadata generation from nearby devices

Tuesday, 29 July 2014

Today Apple was awarded a patent for a process in which, when data is created or saved on a device, the device detects nearby devices ('second devices') and offers possible metadata tags that could be associated with the data:

Identifying the content can include identifying the content that has just been created (e.g., identifying a digital picture right after the picture is taken), or selecting from a list of content that was created when at least one of the second devices was in transmission range with the first device. In the latter case, each content file can be associated with a list of second devices that were present. The user of the first device can have the options of labeling the file with a particular second device or a group of second devices (e.g., multiple labels can be assigned to each file).

The content can have a variety of formats. For example, the content can be a text file (e.g., a text note), a data file (e.g., a spreadsheet), a multimedia file (e.g., a sound recording, a digital image, or a digital movie clip), or in any other format (e.g., a voice mail message, an entry in a data sheet, a record in the database, etc)

For OS X and iOS 8 users, the metadata would appear as tags associated with a file, calendar event, contact or note. For Pro Apps users the metadata would appear as keywords associated with stills, audio and video clips recorded on iOS, OS X and other devices.

Those controlling public devices such as iBeacons could also offer up useful metadata for those creating content in public spaces.
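A plain data-model sketch of the idea - not Apple's implementation, and with made-up names - might look like this: the devices in range when a clip is recorded become suggested keywords the user can accept or ignore.

```swift
import Foundation

// 'Second devices' detected while content was being created.
struct NearbyDevice {
    let name: String        // e.g. "Anna's iPhone", "Stage-left iBeacon"
}

struct RecordedClip {
    let url: URL
    var keywords: [String]
}

// Offer one candidate keyword per nearby device that isn't already applied.
func suggestedKeywords(for clip: RecordedClip, nearby: [NearbyDevice]) -> [String] {
    return nearby.map { $0.name }.filter { !clip.keywords.contains($0) }
}
```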

 

Apple's video conferencing patent

Tuesday, 29 July 2014

Apple has been awarded a video conferencing patent for connecting multiple cameras in one location:

Multiple cameras are oriented to capture video content of different image areas and generate corresponding original video streams that provide video content of the image areas.

One part seems to overlap with the way Google Hangouts works:

An active one of the image areas may be identified at any time by analyzing the audio content originating from the different image areas and selecting the image area that is associated with the most dominant speech activity.

The illustrations are interesting - ranging from a TARDIS-like desk to a vertical video display (showing that the video feeds could be sent to devices people view in portrait mode).

Tardis

 

 

 

Apple Pro Apps Revenue: 2005-2014

Monday, 28 July 2014

For many years most users of Apple's 'Pro Apps' assumed that they were a loss leader for high-end Macintoshes. Very light copy protection meant that many students pirated the software, but that wasn't a problem for Apple. 99.9% of Final Cut Pro users needed a Mac to run the software.

These days Final Cut Pro X and Logic Pro X users may love their software, but many are worried that Apple might discontinue them at any moment. Pro Apps are a very small contributor to Apple's bottom line, and the transition from Final Cut Pro 7 to Final Cut Pro X has left many people gun-shy. They don't want to bet their livelihood on an ecosystem that Apple may abandon at a whim.

Here's a look at how much revenue Apple probably gets from selling Pro Apps; perhaps it will alleviate some of those worries.

Apple's metadata propagation patent

Tuesday, 22 July 2014

Apple has been awarded a patent that says metadata propagation rules can be included with video files. That means you could pass on a video file with metadata that would be available to an editor but not exported when they generate new content based on the files you sent them:

Some embodiments provide a method for processing metadata associated with digital video in a multi-state video computer readable medium. The method specifies a set of rules for propagating the metadata between different states in the video computer readable medium. It then propagates the metadata between the states based on the specified set of rules.

It also describes an example when the metadata in one set of video clips can be assigned to a related set of clips stored elsewhere. This would apply if an on-set assistant had added metadata to lo-res H.264 clips on an iPad and an editor wanted some of the metadata applied to the media from the professional cameras.

It also says that the metadata could also define which parts of the high-quality media should be captured later:

In some embodiments, the method recaptures digital video from a first storage, when at least a portion of the digital video is also stored in a second storage. The method retrieves the digital video from the first storage. It identifies a set of metadata that is stored for the digital video in the second storage, and then determines whether there is an associated set of rules for processing this set of metadata when the digital video is re-captured from the first storage. If so, the method then stores the set of metadata with the retrieved digital video in a third storage.
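To picture how such rules might work in practice, here's a loose sketch (again, not Apple's implementation - the rule names and keys are invented) of metadata that knows whether it should survive an export:

```swift
import Foundation

// How a clip's metadata might propagate when new content is generated from it.
enum PropagationRule {
    case always          // travels with every copy and export
    case editorOnly      // visible while editing, stripped from exported masters
    case never           // stays with the original file only
}

struct ClipMetadata {
    let values: [String: String]
    let rules: [String: PropagationRule]

    // The metadata that should be carried into newly generated content.
    func propagated(forExport: Bool) -> [String: String] {
        return values.filter { key, _ in
            switch rules[key] ?? .always {
            case .always:     return true
            case .editorOnly: return !forExport
            case .never:      return false
            }
        }
    }
}
```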

 
