Optimizing Media Production and Archival Workflows | Backblaze

How to Run AI/ML Workloads on CoreWeave + Backblaze

At Backblaze's 2023 Tech Day, CoreWeave and Backblaze Chief Technical Evangelist Pat Patterson discussed how the two platforms work together. Read on for the details.

A decorative image showing the Backblaze and CoreWeave logos superimposed on clouds.

Backblaze compute partner CoreWeave is a specialized GPU cloud provider designed to power use cases such as AI/ML, graphics, and rendering up to 35x faster and for 80% less than generalized public clouds. Brandon Jacobs, an infrastructure architect at CoreWeave, joined us earlier this year for Backblaze Tech Day ‘23. Brandon and I co-presented a session explaining both how to back up CoreWeave Cloud storage volumes to Backblaze B2 Cloud Storage and how to load a model from Backblaze B2 into the CoreWeave Cloud inference stack.

Since we recently published an article covering the backup process, in this blog post I’ll focus on loading a large language model (LLM) directly from Backblaze B2 into CoreWeave Cloud.

Below is the session recording from Tech Day; feel free to watch it instead of, or in addition to, reading this article.

More About CoreWeave

In the Tech Day session, Brandon covered the two sides of CoreWeave Cloud: 

  1. Model training and fine tuning. 
  2. The inference service. 

To maximize performance, CoreWeave provides a fully-managed Kubernetes environment running on bare metal, with no hypervisors between your containers and the hardware.

CoreWeave provides a range of storage options: storage volumes that can be directly mounted into Kubernetes pods as block storage or a shared file system, running on solid state drives (SSDs) or hard disk drives (HDDs), as well as their own native S3-compatible object storage. Knowing that, you’re probably wondering, “Why bother with Backblaze B2, when CoreWeave has their own object storage?”

The answer echoes the first few words of this blog post—CoreWeave’s object storage is a specialized implementation, co-located with their GPU compute infrastructure, with high-bandwidth networking and caching. Backblaze B2, in contrast, is general purpose cloud object storage, and includes features, such as Object Lock and lifecycle rules, that aren’t a focus of CoreWeave’s specialized offering. There is also a price differential: currently, at $6/TB/month, Backblaze B2 is one-fifth the cost of CoreWeave’s object storage.

So, as Brandon and I explained in the session, CoreWeave’s native storage is a great choice for both the training and inference use cases, where you need the fastest possible access to data, while Backblaze B2 shines as longer-term storage for training, model, and inference data, as well as the destination for data output from the inference process. In addition, since Backblaze and CoreWeave are bandwidth partners, you can transfer data between our two clouds with no egress fees, freeing you from unpredictable data transfer costs.

Loading an LLM From Backblaze B2

To demonstrate how to load an archived model from Backblaze B2, I used CoreWeave’s GPT-2 sample. GPT-2 is a predecessor of the GPT-3.5 and GPT-4 LLMs used in ChatGPT. As such, it’s an accessible way to get started with LLMs, but, as you’ll see, it certainly doesn’t pass the Turing test!

This sample comprises two applications: a transformer and a predictor. The transformer implements a REST API: it handles incoming prompt requests from client apps and encodes each prompt into a tensor, which it passes to the predictor. The predictor applies the GPT-2 model to the input tensor, returning an output tensor to the transformer for decoding into text that is returned to the client app. The two applications have different hardware requirements—the predictor needs a GPU, while the transformer is satisfied with just a CPU, so they are configured as separate Kubernetes pods and can be scaled up and down independently.

Since the GPT-2 sample includes instructions for loading data from Amazon S3, and Backblaze B2 features an S3-compatible API, it was a snap to modify the sample to load data from a Backblaze B2 Bucket. In fact, there was just a single line to change in the s3-secret.yaml configuration file. The file is only 10 lines long, so here it is in its entirety:

apiVersion: v1
kind: Secret
metadata:
  name: s3-secret
  annotations:
     serving.kubeflow.org/s3-endpoint: s3.us-west-004.backblazeb2.com
type: Opaque
data:
  AWS_ACCESS_KEY_ID: <my-backblaze-b2-application-key-id>
  AWS_SECRET_ACCESS_KEY: <my-backblaze-b2-application-key>

As you can see, all I had to do was set the serving.kubeflow.org/s3-endpoint metadata annotation to my Backblaze B2 Bucket’s endpoint and paste in an application key and its ID.
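
One detail worth noting if you’re following along: in a Kubernetes Secret, the values under data: must be base64-encoded. Assuming a hypothetical application key ID and key, you can generate the encoded values like this before pasting them into s3-secret.yaml:

# Hypothetical credentials -- substitute your own B2 application key ID and key
echo -n '004a1b2c3d4e5f60000000001' | base64
echo -n 'K004AbCdEfGhIjKlMnOpQrStUvWxYz0' | base64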

While that was the only Backblaze B2-specific edit, I did have to configure the bucket and path where my model was stored. Here’s an excerpt from gpt-s3-inferenceservice.yaml, which configures the inference service itself:

apiVersion: serving.kubeflow.org/v1alpha2
kind: InferenceService
metadata:
  name: gpt-s3
  annotations:
    # Target concurrency of 4 active requests to each container
    autoscaling.knative.dev/target: "4"
    serving.kubeflow.org/gke-accelerator: Tesla_V100
spec:
  default:
    predictor:
      minReplicas: 0 # Allow scale to zero
      maxReplicas: 2 
      serviceAccountName: s3-sa # The B2 credentials are retrieved from the service account
      tensorflow:
        # B2 bucket and path where the model is stored
        storageUri: s3://<my-bucket>/model-storage/124M/
        runtimeVersion: "1.14.0-gpu"
        ...

Aside from the storageUri configuration, you can see how the predictor application’s pod is configured to scale between zero and two instances (“replicas” in Kubernetes terminology). The remainder of the file contains the transformer pod configuration, allowing it to scale from zero to a single instance.

Running an LLM on CoreWeave Cloud

Spinning up the inference service involved a kubectl apply command for each configuration file and a short wait for the CoreWeave GPU cloud to bring up the compute and networking infrastructure.
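
Assuming the file names used in the sample (your exact set of files may differ), the deployment looks something like this:

kubectl apply -f s3-secret.yaml
kubectl apply -f gpt-s3-inferenceservice.yaml

# Check that the InferenceService reports READY before sending requests
kubectl get inferenceservice gpt-s3

Once the predictor and transformer services were ready, I used curl to submit my first prompt to the transformer endpoint: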

% curl -d '{"instances": ["That was easy"]}' http://gpt-s3-transformer-default.tenant-dead0a.knative.chi.coreweave.com/v1/models/gpt-s3:predict
{"predictions": ["That was easy for some people, it's just impossible for me,\" Davis said. \"I'm still trying to" ]}

In the video, I repeated the exercise, feeding GPT-2’s response back into it as a prompt a few times to generate a few paragraphs of text. Here’s what it came up with:

“That was easy: If I had a friend who could take care of my dad for the rest of his life, I would’ve known. If I had a friend who could take care of my kid. He would’ve been better for him than if I had to rely on him for everything.

The problem is, no one is perfect. There are always more people to be around than we think. No one cares what anyone in those parts of Britain believes,

The other problem is that every decision the people we’re trying to help aren’t really theirs. If you have to choose what to do”

If you’ve used ChatGPT, you’ll recognize how far LLMs have come since GPT-2’s release in 2019!

Run Your Own Large Language Model

While CoreWeave’s GPT-2 sample is an excellent introduction to the world of LLMs, it’s a bit limited. If you’re looking to get deeper into generative AI, another sample, Fine-tune Large Language Models with CoreWeave Cloud, shows how to fine-tune a model from the more recent EleutherAI Pythia suite.

Since CoreWeave is a specialized GPU cloud designed to deliver best-in-class performance, up to 35x faster and 80% less expensive than generalized public clouds, it’s a great choice for workloads such as AI, ML, rendering, and more. And, as you’ve seen in this blog post, it’s easy to integrate with Backblaze B2 Cloud Storage, with no data transfer costs. For more information, contact the CoreWeave team.

Free Your Premiere Pro Workflows With Backblaze Cloud Storage

Read about James Flores' recent experience using Backblaze B2 Cloud Storage with Adobe Premiere Pro to create seamless, remote media workflows.

A decorative image showing a mockup of Premiere Pro's user interface and the Backblaze storage cloud.

Projects and technologies come and go, and with each new tool come new workflow changes. But changing the way you move media around can be tough. Maybe you’ve always done things a certain way, and using a new tool feels like too much of a learning curve, especially when you’re pressed for time. But the way you’ve always done things isn’t always the best, easiest, or fastest way. Sometimes you need to change the status quo to level up your media operations.

As a freelance editor, I worked on a recent project that presented some challenges that demanded new approaches to media storage, challenges you might also be facing. I solved them with the cloud—but not an all-in-one cloud. My solution was a mix of cloud tools, including Adobe Premiere Pro, which gives me customization and flexibility—the best of all worlds in media workflows.

Right Opportunity at the Right Time

Last year I had the opportunity to serve as a digital imaging technician (DIT) on the set of an indie film titled “Vengeance” produced by Falcon Pictures. The role of a DIT can vary. In many instances you’re simply a data wrangler making backups of the data being shot. In others, you work in the color space of the project, creating color-corrected dailies on set. For “Vengeance”, I was mostly data wrangling.

“Vengeance” was an 11-day shoot in the mountains of Northern California near Bass Lake. While the rest of the crew spent their days hiking around with equipment, I was stationed back at home base with my DIT cart. With a lot of free time, I found myself logging data as it came in. Logging clip names soon turned into organizing bins and prepping the project for editing. And, while I was not the editor on the project, I was happy to help edit while I was on set. 

The Challenge

A few months after my work as DIT ended, it became clear that “Vengeance” needed a boost in post-production. The editing was a bit stuck—they had no assistant editor to complete logging and to sound sync all the footage. So, I was asked to help out. The only problem: I needed to be able to share my work with another editor who lived 45 miles away.

A screenshot of an indie film, Vengeance, being edited in Adobe Premiere Pro.
Editing “Vengeance” in Adobe Premiere Pro.

Evaluating the World of Workflows and Cloud Tools

So we began to evaluate a few different solutions. It was clear that Adobe Premiere Pro would be used, but data storage was still a big question. We debated a few methods for sharing media:

  1. The traditional route: Sharing a studio. With the other editor 45 miles away, commuting and scheduling time with each other was going to be cumbersome. 
  2. Email: We could email project files back and forth as we worked, but how would we keep track of versioning? Project bloat was a big concern. 
  3. Sharing a shuttle drive. Or what I’m calling “Sneakernet 2.0.” This is a popular method, but far from efficient. 
  4. Google Drive or Dropbox: Another popular option, but also one that comes with costs and service limitations like rate limiting. 

None of these options were great, so we went back to the drawing board. 

The Solution: A Hybrid Workflow Designed for Our Needs

To come to a final decision for this workflow, we made a list of our needs: 

  • The ability to share a Premiere Pro project file for updates. 
  • The ability to share media for the project. 
  • No exchanging external hard drives. 
  • No driving (a car).  
  • Changes need to be real time.

Based on those needs, here’s where we landed.

Sharing Project Files

Adobe recently released a new update to its Team Projects features within Premiere Pro. Team Projects allows you to host a Premiere Pro project in the Adobe cloud and share it with other Adobe Creative Cloud users. This gave us the flexibility to share a single project and share updates in real time. This means no emailing of project files, versioning issues, or bloated files. That left the issue of the media: How do we share media files?

Sharing Media Files

You may think that it would be obvious to share files in the Adobe Creative Cloud, where you get 100GB free. And while 100GB may be enough storage for .psd and .ai files, 100GB is nothing for video, especially when we are talking about RED (.r3d) files, which start off as approximately 4GB chunks and can quickly add up to terabytes of footage.

So we put everything in a Backblaze B2 Bucket. All the .r3d source files went directly from my Synology network attached storage (NAS) into a Backblaze B2 Bucket using the Synology Cloud Sync tool. In addition to the source files, I used Adobe Media Encoder to generate proxy files of all the .r3d files. This folder of proxy files also synced with Backblaze automatically. 

Making Changes in Real Time

What was great about this solution is that all of the uploading is done automatically via a seamless Backblaze + Synology integration, and the Premiere Pro Team Project had a slew of publish functions perfect for real-time updates. And because the project files and proxies are stored in the cloud, I could get to them from several computers. I spent time at my desktop PC logging and syncing footage, but was also able to move to my couch and do the same from my MacBook Pro. I never had to move hard drives around, copy project files, or worry about version control.

The other editor was able to connect to my Backblaze B2 Bucket using Cyberduck, a cloud storage browser for Mac. Using Cyberduck, he was able to pull down all the proxy files I created and share any files that he created. So, we were synced for the entire duration of the project. 
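
If an editor prefers the command line to a graphical client like Cyberduck, the same bucket can be reached with any S3-compatible tool. For example, pulling down the proxy folder with the AWS CLI pointed at the B2 S3 endpoint might look like this (the bucket name and endpoint region here are hypothetical; credentials are set up the usual way with aws configure):

aws s3 sync s3://vengeance-proxies/proxies ./proxies --endpoint-url https://s3.us-west-004.backblazeb2.com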

Once the technology was configured, I was able to finish logging for “Vengeance”, sync all the sound, build out stringouts and assemblies, and even put together a rough cut of every scene for the entire movie, giving the post-production process the boost it needed.

A diagram showing how editors use Backblaze B2 Cloud Storage with Adobe Premiere Pro.

The Power of Centralized Storage for Media Workflows

Technology is constantly evolving, and, in most circumstances, technology makes how we work a lot easier. For years, filmmakers have worked on projects by physically moving source material, whether it was on film reels, tapes, or hard drives. The cloud changed all that.

The key to getting “Vengeance” through post-production was our centralized approach to file management. The files already existed in Backblaze; we simply brought Premiere Pro to the data rather than moving a huge volume of files to Premiere Pro via the Creative Cloud.

The mix of technologies let us create a customized workflow that works for us. Creative Cloud provided the project sharing mechanism, and Backblaze provided a way to share media (via Synology and Cyberduck) regardless of the tooling each editor had.

Once we hit picture lock, the centralized files will serve as a distribution point for VFX, color, and sound, making turnover a breeze. It can even be used as a distribution hub—check out how American Public Television uses Backblaze to distribute their finished assets. 

Centralizing in the cloud not only made it easy for me to work from home, it allowed us to collaborate on a project with ease, eliminating the overhead of driving, shuttle drive delivery (Sneakernet 2.0), and version control. The best part? A workflow like this is affordable for any size production and can be set up in minutes.

Have you recently moved to a cloud workflow? Let us know what you’re using and how it went in the comments. 

The Power of Specialized Cloud Providers: A Game Changer for SaaS Companies

Cloud-based tech stacks have moved beyond a one-size-fits-all approach. Here's how specialized cloud providers can help your SaaS company customize its tech stack.

A decorative image showing a cloud with the Backblaze logo, then logos hanging off it for Vultr, Fastly, Equinix Metal, Terraform, and rclone.

“Nobody ever got fired for buying AWS.” It’s true: AWS’s one-size-fits-all solution worked great for most businesses, and those businesses made the shift away from the traditional model of on-prem and self-hosted servers—what we think of as Cloud 1.0—to an era where AWS was the cloud, the one and only, which is what we call Cloud 2.0. However, as the cloud landscape evolves, it’s time to question the old ways. Maybe nobody ever got fired for buying AWS, but these days, you can certainly get a lot of value (and kudos) for exploring other options. 

Developers and IT teams might hesitate when it comes to moving away from AWS, but AWS comes with risks, too. If you don’t have the resources to manage and maintain your infrastructure, costs can get out of control, for one. As we enter Cloud 3.0 where the landscape is defined by the open, multi-cloud internet, there is an emerging trend that is worth considering: the rise of specialized cloud providers.

Today, I’m sharing how software as a service (SaaS) startups and modern businesses can take advantage of these highly-focused, tailored services, each specializing and excelling in specific areas like cloud storage, content delivery, cloud compute, and more. Building on a specialized stack offers more control, better return on investment, and greater flexibility, while still achieving the performance you expect from hyperscaler infrastructure.

From a cost of goods sold perspective, AWS pricing wasn’t a great fit. From an engineering perspective, we didn’t want a net-new platform. So the fact that we got both with Backblaze—a drop-in API replacement with a much better cost structure—it was just a no-brainer.

—Rory Petty, Co-Founder & CTO, Tribute

The Rise of Specialized Cloud Providers

Specialized providers—including content delivery networks (CDNs) like Fastly, bunny.net, and Cloudflare, as well as cloud compute providers like Vultr—offer services that focus on a particular area of the infrastructure stack. Rather than trying to be everything to everyone, like the hyperscalers of Cloud 2.0, they do one thing and do it really well. Customers get best-of-breed services that allow them to build a tech stack tailored to their needs. 

Use Cases for Specialized Cloud Providers

There are a number of businesses that might benefit from switching from hyperscalers to specialized cloud providers.

In order for businesses to take advantage of the benefits (since most applications rely on more than just one service), these services must work together seamlessly. 

Let’s Take a Closer Look at How Specialized Stacks Can Work For You

If you’re wondering how exactly specialized clouds can “play well with each other,” we ran a whole series of application storage webinars that talk through specific examples and use cases. I’ll share what’s in it for you below.

1. Low Latency Multi-Region Content Delivery with Fastly and Backblaze

Did you know a 100-millisecond delay in website load time can hurt conversion rates by 7%? In this session, Pat Patterson from Backblaze and Jim Bartos from Fastly discuss the importance of speed and latency in user experience. They highlight how Backblaze’s B2 Cloud Storage and Fastly’s content delivery network work together to deliver content quickly and efficiently across multiple regions. Businesses can ensure that their content is delivered with low latency, reducing delays and optimizing user experience regardless of the user’s location.

2. Scaling Media Delivery Workflows with bunny.net and Backblaze

Delivering content to your end users at scale can be challenging and costly. Users expect exceptional web and mobile experiences with snappy load times and zero buffering. Anything less than an instantaneous response may cause them to bounce. 

In this webinar, Pat Patterson demonstrates how to efficiently scale your content delivery workflows from content ingestion, transcoding, storage, to last-mile acceleration via bunny.net CDN. Pat demonstrates how to build a video hosting platform called “Cat Tube” and shows how to upload a video and play it using HTML5 video element with controls. Watch below and download the demo code to try it yourself.

3. Balancing Cloud Cost and Performance with Fastly and Backblaze

With a global economic slowdown, IT and development teams are looking for ways to slash cloud budgets without compromising performance. E-commerce, SaaS platforms, and streaming applications all rely on high-performance infrastructure, but balancing bandwidth and storage costs can be challenging. In this 45-minute session, we explored how to recession-proof your growing business with key cloud optimization strategies, including ways to leverage Fastly’s CDN to balance bandwidth costs while avoiding performance tradeoffs.

4. Reducing Cloud OpEx Without Sacrificing Performance and Speed

Greg Hamer from Backblaze and DJ Johnson from Vultr explore the benefits of building on best-of-breed, specialized cloud stacks tailored to your business model, rather than being locked into traditional hyperscaler infrastructure. They cover real-world use cases, including:

  • How Can Stock Photo broke free from AWS and reduced their cloud bill by 55% while achieving 4x faster generation.
  • How Monument Labs launched a new cloud-based photo management service to 25,000+ users.
  • How Black.ai processes 1000s of files simultaneously, with a significant reduction of infrastructure costs.

5. Leveling Up a Global Gaming Platform while Slashing Cloud Spend by 85%

James Ross of Nodecraft, an online gaming platform that aims to make gaming online easy, shares how he moved his global game server platform from Amazon S3 to Backblaze B2 for greater flexibility and 85% savings on storage and egress. He discusses the challenges of managing large files over the public internet, which can result in expensive bandwidth costs. By storing game titles on Backblaze B2 and delivering them through Cloudflare’s CDN, they achieve reduced latency since games are cached at the edge, and pay zero egress fees thanks to the Bandwidth Alliance. Nodecraft also benefited from Universal Data Migration, which allows customers to move large amounts of data from any cloud services or on-premises storage to Backblaze’s B2 Cloud Storage, managed by Backblaze and free of charge.

Migrating From a Hyperscaler

Though it may seem daunting to transition from a hyperscaler to a specialized cloud provider, it doesn’t have to be. Many specialized providers offer tools and services to make the transition as smooth as possible. 

  • S3-compatible APIs, SDKs, CLI: Interface with storage as you would with Amazon S3—switching can be as easy as dropping in a new storage target (see the sketch after this list).
  • Universal Data Migration: Free and fully managed migrations to make switching as seamless as possible.
  • Free egress: Move data freely with the Bandwidth Alliance and other partnerships between specialized cloud storage providers.
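
To make “dropping in a new storage target” concrete, here’s a minimal sketch using the AWS CLI: pointed at a hypothetical Backblaze B2 Bucket, the only differences from talking to Amazon S3 are the endpoint URL and the credentials you configure.

# List the contents of a B2 Bucket with the same tooling you'd use for Amazon S3
# (bucket name and endpoint region are hypothetical)
aws s3 ls s3://my-app-assets --endpoint-url https://s3.us-west-004.backblazeb2.com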

As the decision maker at your growing SaaS company, it’s worth considering whether a specialized cloud stack could be a better fit for your business. By doing so, you could potentially unlock cost savings, improve performance, and gain the flexibility to adapt your services to your unique needs. The one-size-fits-all approach is no longer the only option out there.

Want to Test It Out Yourself?

Take a proactive approach to cloud cost management: Get 10GB free to test and validate your proof of concept (POC) with Backblaze B2. All it takes is an email to get started.


Unlocking Media Collaboration: How to Use Hybrid Cloud to Boost Productivity

For modern media teams, speed and collaboration are key. Here are some ways that a Synology + Backblaze B2 hybrid cloud setup can benefit media and entertainment professionals.

A decorative image showing a Synology NAS with various icons representing file types going up into a cloud with a Backblaze logo.

In today’s fast-paced media landscape, efficient collaboration is essential for success. With teams managing large files across geographically dispersed team members on tight deadlines, the need for a robust, flexible storage solution has never been greater. Hybrid cloud storage addresses this need by combining the power of on-premises solutions, like network attached storage (NAS) devices, with cloud storage, creating an ideal setup for enhanced productivity and seamless collaboration.

In this post, I’ll walk you through some approaches for optimizing media workflows using hybrid cloud storage. You’ll learn how to unlock fast local storage, easy file sharing and collaboration, and enhanced data protection, which are all essential components for success in the media and entertainment industry. 

Plus, we’ll share specific media workflows for different types of collaboration scenarios and practical steps you can take to get started with your hybrid cloud approach today using Synology NAS and Backblaze B2 Cloud Storage as an example.

Common Challenges for Media Teams

Before we explore a hybrid cloud approach that combines NAS devices with cloud storage, let’s first take a look at some of the common challenges media teams face, including:

  • Data storage and accessibility.
  • File sharing and collaboration.
  • Security and data protection.

Data Storage and Accessibility Challenges

It’s no secret that recent data growth has been exponential. This is no different for media files. Cameras are creating larger and higher-quality files. There are more projects to shoot and edit. And editors and team members require immediate access to those files due to the high demand for fresh content.

File Sharing and Collaboration Challenges

Back in 2020, everyone was forced to go remote and the workforce changed. Now you can hire freelancers and vendors from around the world. This means you have to share assets with external contributors, and, in the past, this used to exclusively mean shipping hard drives to said vendors (and sometimes, it can still be necessary). Different contractors, freelancers, and consultants may use different tools and different processes.

Security and Data Protection Challenges

Data security poses unique challenges for media teams due to the industry’s specific requirements including managing large files, storing data on physical devices, and working with remote teams and external stakeholders. The need to protect sensitive information and intellectual property from data breaches, accidental deletions, and device failures adds complexity to data protection initiatives. 

How Does Hybrid Cloud Help Media Teams Solve These Challenges?

As a quick reminder, the hybrid cloud refers to a computing environment that combines the use of both private cloud and public cloud resources to achieve the benefits of each platform.

illustration of hybrid cloud workflow

A private cloud is a dedicated and secure cloud infrastructure designed exclusively for a single tenant or organization. It offers a wide range of benefits to users. With NAS devices, organizations can enjoy centralized storage, ensuring all files are accessible in one location. Additionally, it offers fast local access to files that helps streamline workflows and productivity. 

The public cloud, on the other hand, is a shared cloud infrastructure provided by cloud storage companies like Backblaze. With public cloud, organizations can scale their infrastructure up or down as needed without the up-front capital costs associated with traditional on-premises infrastructure. 

By combining cloud storage with NAS, media teams can create a hybrid cloud solution that offers the best of both worlds. Private local storage on NAS offers fast access to large files while the public cloud securely stores those files in remote servers and keeps them accessible at a reasonable price.

How To Get Started With A Hybrid Cloud Approach

If you’d like to get started with a hybrid cloud approach, using NAS on-premises is an easy entry point. Here are a few tips to help you choose the right NAS device for your data storage and collaboration needs. 

  • Storage Requirements: Begin by assessing your data volume and growth rate to determine how much storage capacity you’ll need. This will help you decide the number of drives required to support your data growth. 
  • Compute Power: Evaluate the NAS device’s processor, controller, and memory to ensure it can handle the workloads and deliver the performance you need for running applications and accessing and sharing files.
  • Network Infrastructure: Consider the network bandwidth, speed, and port support offered by the NAS device. A device with faster network connectivity will improve data transfer rates, while multiple ports can facilitate the connection of additional devices.
  • Data Collaboration: Determine your requirements for remote access, sync direction, and security needs. Look for a NAS device that provides secure remote access options, and supports the desired sync direction (one-way or two-way) while offering data protection features such as encryption, user authentication, and access controls. 

By carefully reviewing these factors, you can choose a NAS device that meets your storage, performance, network, and security needs. If you’d like additional help choosing the right NAS device, download our complete NAS Buyer’s Guide. 

Download the Guide ➔

Real-World Examples: Using Synology NAS + Backblaze B2

Let’s explore a hybrid cloud use case. To discuss specific media workflows for different types of collaboration scenarios, we’re going to use Synology NAS as the private cloud and Backblaze B2 Cloud Storage as the public cloud as examples in the rest of this article. 

Scenario 1: Working With Distributed Teams Across Locations

In the first scenario, let’s assume your organization has two different locations with your teams working from both locations. Your video editors work in one office, while a separate editorial team responsible for final reviews operates from the second location. 

To facilitate seamless collaboration, you can install a Synology NAS device at both locations and connect them to Backblaze B2 using Cloud Sync. 

illustration of hybrid cloud workflow using Synology NAS and Backblaze B2 Cloud Storage

Here’s a video guide that demonstrates how to synchronize Synology NAS to Backblaze B2 using Cloud Sync.

https://youtu.be/oQOYcuEvjPA

This hybrid cloud setup allows for fast local access, easy file sharing, and real-time synchronization between the two locations, ensuring that any changes made at one site are automatically updated in the cloud and mirrored at the other site.

Scenario 2: Working With Distributed Teams

In this second scenario, you have teams working on your projects from different regions, let’s say the U.S. and Europe. Downloading files from different parts of the world can be time-consuming, causing delays and impacting productivity. To solve this, you can use Backblaze B2 Cloud Replication. This allows you to replicate your data automatically from your source bucket (U.S. West) to a destination bucket (EU Central). 

Source files can be uploaded into B2 Bucket on the U.S. West region. These files are then replicated to the EU Central region so you can move data closer to your team in Europe for faster access. Vendors and teams in Europe can configure their Synology NAS devices with Cloud Sync to automatically sync with the replicated files in the EU Central data center.

illustration of hybrid cloud with data replication across geographic regions

Scenario 3: Working With Freelancers

In both scenarios discussed so far, file exchanges can occur between different companies or within the same company across various regions of the world. However, not everyone has access to these resources. Freelancers make up a huge part of the media and entertainment workforce, and not every one of them has a Synology NAS device. 

But that’s not a problem! 

In this case, you can still use a Synology NAS to upload your project files and sync them with your Backblaze B2 Bucket. Instead of syncing to another NAS or replicating to a different region, freelancers can access the files in your Backblaze B2 Bucket using third-party tools like Cyberduck.

illustration of hybrid cloud media workflow using Synology NAS and Backblaze B2 Cloud Storage

This approach allows anyone with an internet connection and the appropriate access keys to access the required files instantly without needing to have a NAS device.

Scenario 4: Working With Vendors

In this final scenario, which is similar to the first one, you collaborate with another company or vendor located elsewhere instead of working with your internal team. Both parties can install their own Synology NAS device at their respective locations, ensuring centralized access, fast local access, and easy file sharing and collaboration. 

The two NAS devices are then connected to a Backblaze B2 Bucket using Cloud Sync, allowing for seamless synchronization of files and data between the two companies.

illustration of collaborative media workflow

Whenever changes are made to files by one company, the updated files are automatically synced to Backblaze B2 and subsequently to the other company’s Synology NAS device. This real-time synchronization ensures that both companies have access to the latest versions of the files, allowing for increased efficiency and collaboration. 

Making Hybrid Cloud Work for Your Production Team

As you can see, there are several different ways you can move your media files around and get them in the hands of the right people—be it another one of your offices, vendors, or freelancers. The four scenarios discussed here are just a few common media workflows. You may or may not have the same scenario. Regardless, a hybrid cloud approach provides you with all the tools you need to customize your workflow to best suit your media collaboration needs.

Ready to Get Started?

With Backblaze B2’s pre-built integration with Synology NAS’s Cloud Sync, getting started with your hybrid cloud approach using Synology NAS and Backblaze B2 is simple and straightforward. Check out our guide, or watch the video below as Pat Patterson, Backblaze Chief Technical Evangelist, walks through how to get your Synology NAS data into B2 Cloud Storage in under 10 minutes using Cloud Sync.

Your first step is creating an account.

https://youtu.be/oQOYcuEvjPA

In addition to Synology NAS, Backblaze B2 Cloud Storage integrates seamlessly with other NAS devices such as Asustor, Ctera, Dell Isilon, iOsafe, Morro Data, OWC JellyFish, Panzura, QNAP, TrueNAS, and more. Regardless of which NAS device you use, getting started with a hybrid cloud approach is simple and straightforward with Backblaze B2.

Hybrid Cloud Unlocks Collaboration and Productivity for Media Teams

Easing collaboration and boosting productivity in today’s fast-paced digital landscape is vital for media teams. By leveraging a hybrid cloud storage solution that combines the power of NAS devices with the flexibility of cloud storage, organizations can create an efficient, scalable, and secure solution for managing their media assets. 

This approach not only addresses storage capacity and accessibility challenges, but also simplifies file sharing and collaboration while ensuring data protection and security. Whether you’re working with your own team across different locations, collaborating with external partners, or working with freelancers, a hybrid cloud solution offers a seamless, cost-effective, and high-performance way to optimize your media workflows and enhance productivity in the ever-evolving world of media and entertainment.

We’d love to hear about other different media workflow scenarios. Share with us how you collaborate with your media teams and vendors in the comments below. 

Object Storage for Film, Video, and Content Creation

Traditionally, one of the biggest challenges in filmmaking and content creation has been storage. Now, cloud object storage has opened up new possibilities. Read on to learn more.

A decorative image showing icons representing drives and storage options superimposed on a cloud. A title reads: Object Storage for Media Workflows

Twenty years ago, who would have thought going to work would mean spending most of your time on a computer and running most of your applications through a web browser or a mobile app? Today, we can do everything remotely via the power of the internet—from email to gaming, from viewing our home security cameras to watching the latest and greatest movie trailers—and we all have opinions about the best browsers, too…

Along with that easy, remote access, a slew of new cloud technologies are fueling the tech we use day in and day out. To get to where we are today, the tech industry had to rethink some common understandings, especially around data storage and delivery. Gone are the days that you save a file on your laptop, then transport a copy of that file via USB drive or CD-ROM (or, dare we say, a floppy disk) so that you can keep working on it at the library or your office. And, those same common understandings are now being reckoned with in the world of film, video, and content creation.

In this post, I’ll dive into storage, specifically cloud object storage, and what it means for the future of content creation, not only for independent filmmakers and content creators, but also in post-production workflows.

The Evolution of File Management

If you are reading this blog you are probably familiar with a storage file system—think Windows Explorer, the Finder on Mac, or directory structures in Linux. You know how to create a folder, create files, move files, and delete folders. This same file structure has made its way into cloud services such as Google Drive, Box, and Dropbox. And many of these technologies have been adopted to store some of the largest content, namely media files like .mp4, .wav, or .r3d files.

But, as camera file outputs grow larger and larger and the amount of content generated by creative teams soars, folder structures get more and more complex. Why is this important?

Well, ask yourself: How much time have you spent searching for clips you know exist, but just can’t seem to find? Sure, you can use search tools to search your folder structure but as you have more and more content, that means searching for the proverbial needle in a haystack—naming conventions can only do so much, especially when you have dozens or hundreds of people adding raw footage, creating new versions, and so on.

Finding files in a complex file structure can take so much time that many of the aforementioned companies create system limits preventing long searches. In addition, they may limit uploads and downloads making it difficult to manage the terabytes of data a modern production creates. So, this all begs the question: Is a traditional file system really the best for scaling up, especially in data-heavy industries like filmmaking and video content creation? Enter: Cloud object storage.

Refresher: What is Object Storage?

You can think of object storage as simply a big pool of storage space filled with object data. In the past we’ve defined object data as “some assemblage of data with one unique identifier and an infinite amount of metadata.” The three components that comprise objects in object storage are key here. They include:

  1. Unique Identifier: Referred to as a universally unique identifier (UUID) or global unique identifier (GUID), this is simply a complex number identifier.
  2. Infinite Metadata: Data about the data with endless possibilities.
  3. Data: The actual data we are storing.

So what does that actually mean?

It means each object (this can be any type of file—a .jpg, .mp4, .wav, .r3d, etc.) has an automatically generated unique identifier, which is just a number (e.g., 4_z6b84cf3535395), versus a folder structure path you must manually create and maintain (e.g., D:\Projects\JOB4548\Assets\RAW\A001\A001_3424OP.RDM\A001_34240KU.RDC\A001_A001_1005ku_001.R3D).

An image of a card catalog.
Interestingly enough, this is where metadata comes from.

It also means each object can have an infinite amount of metadata attached to it. Metadata, put simply, is a “tag” that identifies how the file is used or stored. There are several examples of metadata, but here are just a few:

  • Descriptive metadata, like the title or author.
  • Structural metadata, like how to order pages in a chapter.
  • Administrative metadata, like when the file was created, who has permissions to it, and so on.
  • Legal metadata, like who holds the copyright or if the file is in the public domain.

So, when you’re saying an image file is 400×400 pixels and in .jpg format, you’ve just identified two pieces of metadata about the file. In filmmaking, metadata can include things like reel numbers or descriptions. And, as artificial intelligence (AI) and machine learning tools continue to evolve, the amount of metadata about a given piece of footage or image only continues to grow. AI tools can add data around scene details, facial recognition, and other identifiers, and since those are coded as metadata, you will be able to store and search files using terms like “scenes with Bugs Bunny” or “scenes that are in a field of wildflowers”—and that means that you’ll spend less time trying to find the footage you need when you’re editing.

When you put it all together, you have one gigantic content pool that can grow infinitely. It uses no manually created complex folder structure and naming conventions. And it can hold an infinite amount of data about your data (metadata), making your files more discoverable.
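
To make this concrete, here’s a minimal sketch of storing an object with metadata through an S3-compatible API such as Backblaze B2’s, using the AWS CLI (the bucket name, metadata values, and endpoint region are hypothetical):

# Upload a clip and attach searchable metadata to it
aws s3api put-object \
  --bucket my-footage \
  --key A001_A001_1005ku_001.R3D \
  --body ./A001_A001_1005ku_001.R3D \
  --metadata reel=A001,scene=wildflower-field \
  --endpoint-url https://s3.us-west-004.backblazeb2.com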

Let’s Talk About Object Storage for Content Creation

You might be wondering: What does this have to do with the content I’m creating?

Consider this: When you’re editing a project, how much of your time is spent searching for files? A recent study by GISTICS found that the average creative person searches for media 83 times a week. Maybe you’re searching your local hard drive first, then your NAS, then those USB drives in your closet. Or, maybe you are restoring content off an LTO tape to search for that one clip you need. Or, maybe you moved some of your content to the cloud—is it in your Google Drive or in your Dropbox account? If so, which folder is it in? Or was it the corporate Box account? Do you have permissions to that folder? All of that complexity means that the average creative person fails to find the media they are looking for 35% of the time. But you probably don’t need a study to tell you we all spend huge amounts of time searching for content.

An image showing a command line interface window with a failed search.
Good old “request timed out.”

Here is where object storage can help. With object storage, you simply have buckets (object storage containers) where all your data can live, and you can access it from wherever you’re working. That means all of the data stored on those shuttle drives sitting around your office, your closet of LTO tapes, and even a replica of your online NAS are in a central, easily accessible location. You’re also working from the most recent file.

Once it’s in the cloud, it’s safe from the types of disasters that affect on-premises storage systems, and it’s easy to secure your files, create backups, and so on. It’s also readily available when you need it, and much easier to share with other team members. It’s no wonder many of the apps you use today take advantage of object storage as their primary storage mechanism.

The Benefits of Object Storage for Media Workflows

Object storage offers a number of benefits for creative teams when it comes to streamlining workflows, including:

  • Instant access
  • Integrations
  • Workflow interoperability
  • Easy distribution
  • Off-site backup and archive

Instant Access

With cloud object storage, content is ready when you need it. You know inspiration can strike at any time. You could be knee deep in editing a project, in the middle of binge watching the latest limited series, or out for a walk. Whenever the inspiration decides to strike, having instant access to your library of content is a game changer. And that’s the great thing about object storage in the cloud: you gain access to massive amounts of data with a few clicks.

Integrations

Object storage is a key component of many of the content production tools in use today. For example, iconik is a cloud-native media asset management (MAM) tool that can gather and organize media from any storage location. You can point iconik to your Backblaze B2 Bucket and use its advanced search functions as well as its metadata tagging.

Workflow Interoperability

What if you don’t want to use iconik, specifically? What’s great about using cloud storage as a centralized repository is that no matter what application you use, your data is in a single place. Think of it like your external hard drive or NAS—you just connect that drive with a new tool, and you don’t have to worry about downloading everything to move to the latest and greatest. In essence, you are bringing your own storage (BYOS!).

Here’s an example: CuttingRoom is a cloud native video editing and collaboration tool. It runs entirely in your web browser and lets you create unique stories that can instantly be published to your destination of choice. What’s great about CuttingRoom is its ability to read an object storage bucket as a source. By simply pointing CuttingRoom to a Backblaze B2 Bucket, it has immediate access to the media source files and you can get to editing. On the other hand, if you prefer using a MAM, that same bucket can be indexed by a tool like iconik.

Easy Distribution

Now that your edit is done, it’s time to distribute your content to the world. Or, perhaps you are working with other teams to perfect your color and sound, and it’s time to share your picture lock version. Cloud storage is ready for you to distribute your files to the next team or an end user.

Here’s a recent, real-world example: If you have been following the behind-the-scenes articles about creating Avatar: The Way of Water, you know that not only did its creation spark new technology like the Sony Venice camera with removable sensors, but its distribution featured a cloud-centric workflow. Footage (the film) was placed in an object store (read: cloud storage), processed into different formats, languages were added with 3D captions, and then footage was distributed directly from a central location.

And, while not all of us have Jon Landau as our producer, a huge budget, and a decade to create our product, this same flexibility exists today with object storage—with the added bonus that it’s usually budget-friendly as well.

Off-Site Backup and Archive

And last but certainly not least, let’s talk back up and archive. Once a project is done, you need space for the next project, but no one wants to risk losing the old project. Who out there is completely comfortable hitting the delete key as well as saying yes to the scary prompt, “Are you sure you want to delete?”

Well, that’s what you would have to do in the past. These days, object storage is a great place to store your terabytes and terabytes of archived footage without cluttering your home, office, or set with additional hardware. Compared with on-premises storage, cloud storage lets you add more capacity as you need it—just make sure you understand cloud storage pricing models so that you’re getting the best bang for your buck.

If you’re using a NAS device in your media workflow, you’ll find you need to free up your on-prem storage. Many NAS devices, like Synology and QNAP, have cloud storage integrations that allow you to automatically sync and archive data from your device to the cloud. In fact, you could start taking advantage of this today.

No delete key here—just a friendly archive button.

Getting Started With Object Storage for Media Workflows

Migrating to the cloud may seem daunting, but it doesn’t have to be. Especially with the acceleration of hybrid workflows in the film industry recently, cloud-based workflows are becoming more common and better integrated with the tools we use every day. You can test this out with Backblaze using your free 10GB that you get just for signing up for Backblaze B2. Sure, that may not seem like much when a single .r3d file is 4GB. But with that 10GB, you can test upload speeds and download speeds, try out integrations with your preferred workflow tools, and experiment with AI metadata. If your team is remote, you could try an integration with LucidLink. Or if you’re looking to power a video on-demand site, you could integrate with one of our content delivery network (CDN) partners to test out content distribution, like Backblaze customer Kanopy, a streaming service that delivers 25,000 videos to libraries worldwide.
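
If you want to kick the tires from the command line, a first upload and download test can be as simple as the following, using the AWS CLI against your B2 Bucket’s S3-compatible endpoint (the bucket name, file name, and endpoint region are hypothetical):

# Upload a test clip, then pull it back down to check round-trip transfer speed
aws s3 cp ./test-clip.mp4 s3://my-test-bucket/test-clip.mp4 --endpoint-url https://s3.us-west-004.backblazeb2.com
aws s3 cp s3://my-test-bucket/test-clip.mp4 ./test-clip-download.mp4 --endpoint-url https://s3.us-west-004.backblazeb2.com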

Change is hard, but cloud storage can be easy. Check out all of our media workflow solutions and get started with your 10GB free today.

Virtual vs. Remote vs. Hybrid Production

Media workflows have changed—for the better. Let's discuss the differences between remote, virtual, and hybrid production, and how to leverage cloud-friendly workflows to make your team more efficient.

A decorative image showing icons of a NAS device, a video with a superimposed play button, and a cloud with the Backblaze logo.

For many of us, 2020 transformed our work habits. Changes to the way we work that always seemed years away got rolled out within a few months. Fast forward to today, and the world seems to be returning back to some sense of normalcy. But one thing that’s not going back is how we work, especially for media production teams. Virtual production, remote video production, and hybrid cloud have all accelerated, reducing operating costs and moving us closer to a cloud-based reality.

So what’s the difference between virtual production, remote production, and hybrid cloud workflows, and how can you use any or all of those strategies to improve how you work? At first glance, they all seem to be different variations of the same thing. But there are important differences, and that’s what we’re digging into today. Read on to get an understanding of these new ways of working and what they mean for your creative team.


What Is Virtual Production?

Let’s start with virtual production. It sounds like doing production virtually, which could just mean “in the cloud.” I can assure you, it’s way cooler than that. When the pandemic hit, social distancing became the norm. Gathering a film crew together in a studio or in any location of the world went out the door. Never fear: virtual production came to the rescue.

Virtual production is a method of production where, instead of building a set or going to a specific location, you build a set virtually, usually with a gaming engine such as Unreal Engine. Once the environment is designed and lit within Unreal Engine, it can then be fed to an LED volume. An LED volume is exactly what it sounds like: a huge volume of LED screens connected to a single input (the Unreal Engine environment).

With virtual production, your set becomes the LED volume, and Unreal Engine can change the background to anything you can imagine at the click of a button. Now this isn’t just a LED screen as a background—what makes virtual production so powerful is its motion tracking integration with real cameras.

Using a motion sensor system attached to a camera, Unreal Engine is able to understand where your camera is pointed. (It’s way more tech-y than that, but you get the picture.) You can even match the virtual lens in Unreal Engine with the lens of your physical camera. With the two systems combined, a camera following an actor on a virtual set can react by moving the background along with the camera in real time.

Virtual Production in Action

If you were one of the millions who have been watching The Mandalorian on Disney+, check out this behind the scenes look at how they utilized a virtual production.

 

This also means location scouting can be done entirely inside the virtual set, and the assets created for pre-visualization can actually carry on into post, saving a ton of time (as the post work actually starts during pre-production).

So, virtual production is easily confused with remote production, but it’s not the same. We’ll get into remote production next.

What Is Remote Production?

We’re all familiar with the stages of production: development, pre-production, production, post-production, and distribution. Remote production has more to do with post-production. Remote production is simply the ability to handle post-production tasks from anywhere.

Here’s how the pandemic accelerated remote production: In post, assets are edited in non-linear editing systems (NLEs) connected to huge storage systems located deep within studios and post-production houses. When everyone was forced to work from home, editing became much more difficult. There were, of course, solutions that allowed you to remotely control your edit bay, but remotely controlling a system from miles away and trying to scrub video over your at-home internet bandwidth quickly became a nuisance.

To solve this problem, everyone just took their edit bay home along with a hard drive containing what they needed for their particular project. But shuttling drives all over the place and trying to reconcile files across all those remote drives and the NAS back at the studio meant that storage became the next headache. To resolve this confusion over storage, production houses turned to hybrid solutions, our next topic.

What Are Hybrid Cloud Workflows?

Hybrid cloud workflows didn’t originate during the pandemic, but they did make remote production much easier. A hybrid cloud workflow is a combination of a public cloud, a private cloud, and an on-premises solution like a network attached storage (NAS) device or storage area network (SAN). When we think about storage, we think first about the relationship of our NLE to our local hard drive, then about the relationship between the local computer and the NAS or SAN. The next iteration is the relationship of all of these (NLE, local computer, and NAS/SAN) to the cloud.

For each of these on-prem solutions, the primary problems are capacity and availability: How much can the drive hold, and how do you reach the NAS, over the local area network (LAN) or a virtual private network (VPN)? Cloud storage inherently solves both problems. It’s always available and accessible from any location with an internet connection. So, to solve the problems that remote teams of editors, visual effects (VFX), color, and sound folks faced, the cloud was integrated into many workflows.

Using the cloud, companies are able to store content in a single location where it can then be distributed to different teams (VFX, color, sound, etc.). This central repository makes it possible to move large amounts of data across different regions, making it easier for your team to access it while also keeping it secure. Many NAS devices have native cloud integrations, so the automated file synchronization between the cloud and a local environment is baked in—teams can just get to work.
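
If your NAS doesn’t have a native integration, the core of that sync idea can be sketched against any S3 compatible endpoint, Backblaze B2 included. The snippet below is a minimal, assumption-laden Python example: the bucket name, endpoint, and paths are placeholders, credentials come from environment variables, and it only uploads files that are new or newer locally (no deletions, no conflict handling), so treat it as an illustration rather than a replacement for a real sync tool.

import os
import boto3
from botocore.exceptions import ClientError

BUCKET = "my-media-archive"  # placeholder bucket name

# B2 exposes an S3 compatible API; the endpoint below is one example region.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.us-west-004.backblazeb2.com",
    aws_access_key_id=os.environ["B2_KEY_ID"],
    aws_secret_access_key=os.environ["B2_APP_KEY"],
)

def sync_up(local_root: str, prefix: str = "projects/") -> None:
    """Upload files that are missing from the bucket or newer on local storage."""
    for dirpath, _dirs, files in os.walk(local_root):
        for name in files:
            path = os.path.join(dirpath, name)
            key = prefix + os.path.relpath(path, local_root).replace(os.sep, "/")
            try:
                remote = s3.head_object(Bucket=BUCKET, Key=key)
                if remote["LastModified"].timestamp() >= os.path.getmtime(path):
                    continue  # Cloud copy is already current; skip it.
            except ClientError:
                pass  # Object doesn't exist yet; fall through and upload.
            s3.upload_file(path, BUCKET, key)
            print(f"uploaded {key}")

sync_up("/Volumes/NAS/active-projects")  # placeholder NAS mount point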

Hybrid solutions worked so well that many studios and post houses have adopted them as a permanent part of their workflow and have incorporated remote production into their day-to-day. A good example is the video team at Hagerty, a production crew that creates 300+ videos a year. This means that workflows that were once locked down to specific locations are now moving to the cloud. Now more than ever, API-accessible resources, like cloud storage with S3 compatible APIs that integrate with your preferred tools, are needed to make these workflows actually work.

Just one example of Hagerty’s content.

Hybrid Workflows and Cloud Storage

While the world seems to be returning to a new normal, our way of work is not. For the media and entertainment world, the pandemic gave the space a jolt of electricity, igniting the next wave of innovation. Virtual production, remote production, and hybrid workflows are here to stay. What digital video started 20 years ago, the pandemic has accelerated, and that acceleration is pointing directly to the cloud.

So, what are your next steps as you future-proof your workflow? First, inspect your current set of tools. Many modern tools are already cloud-ready. For example, a Synology NAS already has Cloud Sync capabilities. EditShare also has a tool capable of crafting custom workflows, wherever your data lives. (These are just a few examples.)

Second, start building and testing. Most cloud providers offer free tiers or free trials—at Backblaze, your first 10GB are free, for example. Testing a proof of concept is the best way to understand how new workflows fit into your system without overhauling the whole thing or potentially disrupting business as usual.
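
For a sense of how small that first test can be, here’s a hypothetical round-trip smoke test in Python against an S3 compatible bucket: write a megabyte of test data, read it back, and confirm the checksums match. The endpoint, bucket name, and environment variable names are all placeholders.

import hashlib
import os
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url=os.environ.get("S3_ENDPOINT", "https://s3.us-west-004.backblazeb2.com"),
    aws_access_key_id=os.environ["B2_KEY_ID"],
    aws_secret_access_key=os.environ["B2_APP_KEY"],
)

BUCKET = "workflow-poc"            # placeholder bucket created for the test
payload = os.urandom(1024 * 1024)  # 1MB of random test data

# Upload, download, and verify the round trip with a checksum.
s3.put_object(Bucket=BUCKET, Key="poc/test.bin", Body=payload)
fetched = s3.get_object(Bucket=BUCKET, Key="poc/test.bin")["Body"].read()

assert hashlib.sha256(fetched).digest() == hashlib.sha256(payload).digest()
print("Round trip OK: upload, download, and checksum all match.")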

And finally, one thing you definitely need to make hybrid workflows work is cloud storage. If you’re looking to make the change a lot easier, you came to the right place. Backblaze B2 Cloud Storage pairs with hundreds of integrations so you can implement it directly into your established workflows. Check out our partners and our media solutions for more.

The post Virtual vs. Remote vs. Hybrid Production appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

]]>
https://www.backblaze.com/blog/virtual-vs-remote-vs-hybrid-production/feed/ 1
Work Smarter With Backblaze and Quantum H4000 Essential https://www.backblaze.com/blog/work-smarter-with-backblaze-and-quantum-h4000-essential/ https://www.backblaze.com/blog/work-smarter-with-backblaze-and-quantum-h4000-essential/#respond Wed, 08 Mar 2023 17:28:20 +0000 https://www.backblaze.com/blog/?p=108238 Integrate Backblaze and Quantum for simple media workflows, remote collaboration, and seamless content archiving to the cloud.

The post Work Smarter With Backblaze and Quantum H4000 Essential appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

]]>
A decorative image displaying the Backblaze and Quantum logos.

How much do you think it costs your creative team to manage data? How much time and money is spent organizing files, searching for files, and maybe never finding those files? Have you ever quantified it? One market research firm has. According to GISTICS, a creative team of eight people wastes more than $85,000 per year searching for and managing files—and larger teams waste even more than that.

Creative teams need better tools to work smarter. Backblaze has partnered with Quantum to simplify media workflows, provide easy remote collaboration, and free up on-premises storage space with seamless content archiving to the cloud. The partnership provides teams the tools needed to compete. Read on to learn more.

What Is Quantum?

Quantum is a data storage company that provides technology, software, and services to help companies make video and other unstructured data smarter—so data works for them and not the other way around. Quantum’s H4000 Essential (H4000E) asset management and shared storage solution offers customers an all-in-one appliance that integrates automatic content indexing, search, discovery, and collaboration. It couples the CatDV asset management platform with Quantum’s StorNext 7 shared storage.

How Does This Partnership Benefit Joint Customers?

By pairing Quantum H4000 Essential with Backblaze B2 Cloud Storage, you get award-winning asset management and shared storage with the ability to archive to the cloud. The partnership provides a number of benefits:

  • Better organization: Creative teams work visually, and the Quantum platform supports visual workflows. All content is available in one place, with automatic content indexing, metadata tagging, and proxy generation.
  • Searchable assets: All content and projects are searchable in an easy-to-use visual catalog.
  • Seamless collaboration: Teams can use production tools like Adobe Premiere Pro, Final Cut Pro X, and others to work on shared projects as well as tagging, markup, versioning, chat, and approval tools to streamline collaboration.
  • Robust archive management: Archived content can be restored easily from Backblaze B2 to CatDV to keep work in progress lean and focused.
  • On-premises efficiency: Once projects are complete, they can be quickly archived to the cloud to free up storage space on the H4000E for high-resolution production files and ongoing projects.
  • Simplified billing: Data is stored on always-hot storage, eliminating the management frustration that comes with multiple tiers and variable costs for egress and API calls.

Purchase Cloud Capacity the Same Way You Purchase On-Premises

With Backblaze B2 Reserve, you can purchase capacity-based storage starting at 20TB to pair with your Quantum H4000E if you prefer a predictable cloud spend versus consumption-based billing. Key features of B2 Reserve include:

  • Free egress up to the amount of storage purchased per month.
  • Free transaction calls.
  • Enhanced migration services.
  • No delete penalties.
  • Upgraded Tera support.

Who Would Benefit From Backblaze B2 + Quantum H4000E?

The partnership benefits any team that handles large amounts of data, specifically media files. The solution can help teams with:

  • Simplifying media workflows.
  • Easing remote project management and collaboration.
  • Cloud tiering.
  • Extending on-premises storage.
  • Implementing a cloud-first strategy.
  • Backup and disaster recovery planning.
  • Ransomware protection.
  • Managing consistent data growth.

Getting Started With Backblaze B2 and Quantum H4000E

The Quantum H4000E is a highly-integrated solution for collaborative shared storage and asset management. Configured with Backblaze B2 for content archiving and retrieval, it provides new menu options to perform cloud archiving and move, copy, and restore content, freeing up H4000 local storage for high-resolution files. You can easily add on cloud storage to improve remote media workflows, content collaboration, media asset protection, and archive.

With the H4000E, everything you need to get started is in the box, ready to connect to your 10GbE or faster network. And a simple Backblaze B2 archive plugin connects the solution directly to your Backblaze B2 account.

Simply create a Backblaze account and configure the Backblaze CatDV panel with your credentials.

Join Backblaze at NAB Las Vegas

Join us at NAB to learn more about the Quantum + Backblaze solution. Our booths are neighbors! Schedule a meeting with us for a demo.

The post Work Smarter With Backblaze and Quantum H4000 Essential appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

]]>
https://www.backblaze.com/blog/work-smarter-with-backblaze-and-quantum-h4000-essential/feed/ 0
What’s the Diff: DAM vs. MAM https://www.backblaze.com/blog/whats-the-diff-dam-vs-mam/ https://www.backblaze.com/blog/whats-the-diff-dam-vs-mam/#comments Fri, 01 Jul 2022 16:27:02 +0000 https://www.backblaze.com/blog/?p=87681 Both digital asset management (DAM) and media asset management (MAM) systems make managing and collaborating on media assets much simpler for larger teams. Learn the difference between the two and which one you should use to improve your workflow.

The post What’s the Diff: DAM vs. MAM appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

]]>
What's the Diff DAM vs. MAM

On the surface, outlining the difference between digital asset management (DAM) and media asset management (MAM) might seem like splitting hairs. After all, you’re working with digital media, so what’s the difference between focusing on the “digital” vs. focusing on the “media?”

There are plenty of reasons these two terms are often used interchangeably—both exist to give organizations a central repository of digital assets, from video and images to text documents. They both help manage those assets from initial raw source files, to finished production, to archive. And they both make managing and collaborating on those files much simpler for larger teams.

So, what’s the difference? Put it this way: Not all DAM systems are MAM systems, but all MAM systems are DAM systems.

In essence, MAM is just DAM that offers more capability when it comes to video. While DAM can manage video files, it’s more of a general-purpose tool. There are a lot of nuances that get glossed over in the simplified answer, so it’s worth taking a closer look at the differences between them.

What to Expect From Any Asset Manager

Explaining the difference between a DAM system and a MAM system requires a basic understanding of what an asset manager is, so before we begin, a brief primer. The first thing you need to understand is that any given asset a team might want to work with—a video clip, a document, an image—is usually presented by the asset manager as a single item to the user. Behind the scenes, however, it is composed of three elements:

  • The master source file.
  • A thumbnail or proxy that’s displayed.
  • Metadata about the object itself.

And unlike typical files stored on your own computer, the metadata in asset management files is far more robust than just a simple “date modified” or “file size.” It’s a broader set of attributes, including details about the actual content of the file, which we’ll explain in further detail later on. So, with all of that said, here are the basics of what an asset manager should offer to teams:

  • Collaboration: Members of content creation teams should all have direct access to assets in the asset management system from their own workstations.
  • Access control: Access to specific assets or groups of assets should be allowed or restricted based on the user’s rights and permission settings. These permissions let you isolate certain files for use by a particular department, or allow external clients to view files without making changes.
  • Browse: Assets should be easily identifiable by more than their file name, such as thumbnails or proxies for videos, and browsable in the asset manager’s graphical interface.
  • Metadata search: Assets should be searchable by the attributes used to describe them in the file’s metadata. Metadata assignment capabilities should be flexible and extensible over time.
  • Preview: For larger or archived assets, a preview or quick review capability should be provided, such as playing video proxies or mouse-over zoom for thumbnails.
  • Versions: Based on permissions, team members should be able to add new versions of existing assets or add new assets so that material can be easily repurposed for future projects.

Why Metadata Matters So Much

Metadata matters because it is essentially the biggest difference between organizing content in an asset manager and just chucking it in a folder somewhere. Sure, there are ways to organize files without metadata, but it usually results in letter salad file names like 20190118-gbudman-broll-01-lv-0001.mp4, which strings together a shoot date, subject, camera number, clip number, and who knows what else. Structured file naming might be a “good enough for government work” fix, but it doesn’t scale easily to larger teams of contributors and creators. And metadata isn’t used only to search for assets; it can be fed into other workflow applications integrated with the asset manager for use there.

If you’re working with images and video (which you probably are if you’re using an asset manager), then metadata is vital, because unlike text-based documents, images and video can’t be searched for keywords. Metadata can describe in detail what’s in the image or video. In the example below, we see a video of a BMW M635CSi which has been tagged with metadata like “car,” “vehicle,” and “driving” to help make it more easily searchable. If you look further down in the metadata, you’ll see where tags have been added to describe elements at precise moments or ranges of time in the video, known as timecodes. That way, someone searching for a particular moment within the video will be able to home in on the exact segment they need with a simple search of the asset manager.

iconik MAM.
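
To picture how timecode-based metadata like that makes a clip findable, here’s a small, hypothetical Python sketch: an asset record with a master file, a proxy, clip-level tags, and time-ranged tags, plus a search that returns the moments matching a term. Real asset managers use far richer schemas; every field name here is invented for illustration.

from dataclasses import dataclass, field

@dataclass
class TimecodedTag:
    start: str  # e.g., "00:00:05:00" in HH:MM:SS:FF
    end: str
    label: str

@dataclass
class Asset:
    master_file: str                              # source file path or object key
    proxy: str                                    # low-resolution preview shown in the UI
    tags: list = field(default_factory=list)      # clip-level tags
    moments: list = field(default_factory=list)   # timecode-ranged tags

def find_moments(asset: Asset, term: str) -> list:
    """Return (start, end, label) for every timecoded tag matching the search term."""
    term = term.lower()
    return [(m.start, m.end, m.label) for m in asset.moments if term in m.label.lower()]

clip = Asset(
    master_file="masters/bmw_m635csi.mov",
    proxy="proxies/bmw_m635csi_720p.mp4",
    tags=["car", "vehicle", "driving"],
    moments=[
        TimecodedTag("00:00:05:00", "00:00:12:00", "exterior driving shot"),
        TimecodedTag("00:01:40:00", "00:01:55:00", "interior dashboard close-up"),
    ],
)

print(find_moments(clip, "driving"))  # -> [('00:00:05:00', '00:00:12:00', 'exterior driving shot')]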

Workflow Integration and Archive Support

Whether you’re using a DAM system or a MAM system, the more robust it is in terms of features, the more efficient it is going to make your workflow. These are the features that simplify every step of the process, including features for editorial review, automated metadata extraction (e.g., transcription or facial recognition), multilingual support, automated transcode, and much more. This is where different asset management solutions diverge the most and show their customization for a particular type of workflow or industry.

Maybe you need all of these flashy features for your unique set of needs, maybe you don’t. But you should know that over time, any content library is going to grow to the point where at the bare minimum, you’re going to need storage management features, starting with archiving.

Archiving completed projects and assets that are infrequently used can conserve disk space on your server by moving them off to less expensive storage, such as cloud storage or digital tape. Images and video are infamous for hogging storage, a reputation which has only become more pronounced as resolution has increased, making these files balloon in size. Regular archiving can keep costs down and keep you from having to upgrade your expensive storage server every year.

Refresher: What’s the Difference Between Archive and Backup for Media Teams?

Archiving saves space by moving large files out of the asset management system and into a separate archive, but how exactly is that different from the backups you’re already (hopefully) creating? As we’ve outlined before, a backup exists to aid in recovery of files in the event of hardware failure or data corruption, while archiving is a way to better manage file storage and create long-term file retention.

Ideally, you should be doing both, as they serve far different purposes.

While there are a slew of different features that vary between asset managers, integrated automatic archiving might be one of the most important to look for. Asset managers with this feature will let you access these files from the graphical interface just like any other file in its system. After archiving, the thumbnails or proxies of the archived assets continue to appear as before, with a visual indication that they have been archived (like a graphic callout on the thumbnail—think of the notification widget letting you know you have an email). Users can retrieve the asset as before, albeit with some time delay that depends on the archive storage and network connection chosen.

A good asset manager will offer multiple choices for archive storage—from cloud storage, to LTO tape, to inexpensive disk—and from different vendors. An excellent one will let you automatically make multiple copies to different archive storage for added data protection.

Hybrid Cloud Workflows for Media Teams

Obviously, if you’re reading this it’s because you’ve been looking into asset management solutions for a large team, often working remotely. Which means you have a highly complicated workflow that dominates your day-to-day life. Which means you might have questions well outside the scope of what separates DAM from MAM.

You can read up here on the various ways a hybrid cloud workflow might benefit you, regardless of what kind of asset manager you choose.

What Is MAM?

With all of that said, we can now answer the question you came here asking: What is the difference between DAM and MAM?

While they have much in common, the crucial difference is that MAM systems are designed from the ground up for video production. There is some crossover—DAM systems can generally manage video assets, and MAM systems can manage images and documents—but MAM systems offer more tools for video production and are geared towards the particular needs of a video workflow. That means metadata creation and management, application integrations, and workflow orchestration are all video-oriented.

Both, for example, will be able to track a photo or video from the metadata created the moment that content is captured, e.g., data about the camera, the settings, and any notes the photographer or videographer adds afterward. But a MAM system will allow you to add more detailed metadata to make that photo or video more easily searchable. Nearly all MAM systems offer some type of manual logging to create timecode-based metadata. MAM systems built for live broadcast events like sports provide shortcut buttons for key events, such as a face-off or slap shot in a hockey game.

More advanced systems offer additional tools for automated metadata extraction. For example, some will use facial recognition to automatically identify actors or public figures.

You can even add metadata that shows where that asset has been used, how many times it has been used, and what sorts of edits have been made to it. There’s no end to what you can describe and categorize with metadata. Defining it for a content library of any reasonable size can be a major undertaking.

MAM Systems Integrate Video Production Applications

Another huge difference between a DAM system and a MAM system, particularly for those working with video, is that a MAM system will integrate tools built specifically for video production. These widely ranging integrated applications include ingest tools, video editing suites, visual effects, graphics tools, transcode, quality assurance, file transport, specific distribution systems, and much more.

Modern MAM solutions integrate cloud storage throughout the workflow, not just for archive but also for creating content through proxy editing. Proxy editing gives editors a greater amount of flexibility by letting them work on a lower-resolution copy of the video stored locally. When the final cut is rendered, those edits are applied to the full-resolution version stored in the cloud.
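
As a rough sketch of the proxy half of that workflow, the Python snippet below shells out to FFmpeg to render a low-resolution proxy for local editing while the full-resolution master stays put (for example, in cloud storage). The scale, codec, and quality settings are placeholder choices, and it assumes ffmpeg is installed and on your PATH; real MAM systems generate proxies automatically with their own presets.

import subprocess
from pathlib import Path

def make_proxy(master: str, proxy_dir: str = "proxies") -> Path:
    """Render a 720p H.264 proxy of a full-resolution master using FFmpeg."""
    src = Path(master)
    out = Path(proxy_dir) / f"{src.stem}_proxy.mp4"
    out.parent.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", str(src),
            "-vf", "scale=-2:720",            # keep aspect ratio, 720 pixels tall
            "-c:v", "libx264", "-crf", "28",  # heavier compression is fine for a proxy
            "-c:a", "aac", "-b:a", "128k",
            str(out),
        ],
        check=True,
    )
    return out

# Editors cut against the lightweight proxy; the master never leaves storage.
print(make_proxy("/Volumes/NAS/masters/interview_cam_a.mov"))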

MAM Systems May Be Tailored for Specific Industry Niches and Workflows

To sum up, the longer explanation for DAM vs. MAM is that MAM focuses on video production, with better MAM systems offering all the integrations needed for complex video workflows. And because specific niches within the industry have wildly different needs and workflows, you’ll find MAM systems that are tailored specifically for sports, film, news, and more. The size of the organization or team matters, too. To stay within budget, a small postproduction house might want to choose a more affordable MAM system that lacks some of the more advanced features they wouldn’t need anyway.

This wide variety of needs is a large part of the reason there are so many MAM systems on the market, and why choosing one can be a daunting task with a long evaluation process. Despite the length of that process, it’s actually fairly common for a group to migrate from one asset manager to another as their needs shift.

Pro tip: Working with a trusted system integrator that serves your industry niche can save you a lot of heartache and money in the long run.

It’s worth noting that, for legacy reasons, sometimes what’s marketed as a DAM system will have all the video capabilities you’d expect from a MAM system. So, don’t let the name throw you off. Whether it’s billed as MAM or DAM, look for a solution that fits your workflow with the features and integrated tools you need today, while also providing the flexibility you need as your business changes in the future.

If you’re interested in learning how you can make your cloud-based workflow more efficient (and you should be), check out our comprehensive e-book outlining how to optimize your workflow.

FAQs About Differences Between DAM and MAM

What is the difference between MAM and DAM?

MAM stands for Media Asset Management, while DAM stands for Digital Asset Management. Although the terms are often used interchangeably, MAM offers more tools specific to video. This is important because video files are almost always much larger than other digital assets. Basically, MAM is a sub-category of DAM.

What is a MAM?

A MAM, which stands for Media Asset Management, manages and distributes very large media files. MAM systems were originally used as management solutions for TV and film, as these industries frequently need to store, transfer, and edit high-quality video files that run anywhere from 20 minutes to several hours. As internet content has become more visual and more video-focused, these systems have become more widespread and in demand.

What is a DAM?

A DAM, which stands for Digital Asset Management, allows you to edit and manage all types of digital files, including options for resizing and reformatting, as well as sharing of large files. However, it’s also important to remember that today, many programs marketed as DAM systems will actually have all the capabilities you might expect from a MAM program (plus a few extras you might not need in MAM-specific workflows).

Why use a MAM?

A MAM is the tool you need if your team specializes in creating long-form video and audio files. It’s been built from the ground up to help you manage video products specifically, which means that if your main focus is on video, a MAM is often the best choice.

Why use a DAM?

A DAM can be helpful if you’re looking to manage a number of media types that are not video. They can be very helpful in logically organizing large amounts of content, as well as strange file types that may defy other types of asset management.

The post What’s the Diff: DAM vs. MAM appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

]]>
https://www.backblaze.com/blog/whats-the-diff-dam-vs-mam/feed/ 4
How Backup and Archive Are Different for Professional Media Workflows https://www.backblaze.com/blog/backup-vs-archive-professional-media-production/ https://www.backblaze.com/blog/backup-vs-archive-professional-media-production/#comments Fri, 17 Jun 2022 16:07:59 +0000 https://www.backblaze.com/blog/?p=88300 Learn about the difference between backing up and archiving for media teams, with a real world application from UCSC Silicon Valley.

The post How Backup and Archive Are Different for Professional Media Workflows appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

]]>

When to back up and when to archive? It’s a simple question, and the simple answer is that it depends on the function of the data you’re looking to archive or back up. For media teams, implementing the right tools requires a solid understanding of how your data functions, how often you need it, and how fast you need it.

In this post, we’ll explain the difference between backing up and archiving for media teams, and we’ll walk through a real-world application from UCSC Silicon Valley.

Backup vs. Archive: A Refresher

We explored the broader topic of backing up vs. archiving in our “What’s the Diff: Backup vs. Archive” post. In short, you should use a backup if you intend to keep the data available in case of loss. If you make a copy for regulatory compliance, or to move older, less-used data off to cheaper storage, you should use an archive. Simple, right? Not always, if you’re talking about image, video, and other media files. Read on to learn more.

Backup vs. Archive for Professional Media Workflows

Definitions of backup and archive that apply to general business use cases don’t always apply to professional media workflows. Video and image files differ from typical business data in a number of ways, and that profoundly impacts how they’re protected and preserved throughout their lifecycle.

When backing up media files, there are key differences between which files get backed up and how they get backed up. When archiving media files, there are key differences between when files get archived and why they’re archived. The main differences between business files and media workflow files include:

  • Size: Media files are much larger and more intermediate files are generated through the production process.
  • Archive use case: Media teams archive to save space on their on-premises production storage, while most businesses archive to meet compliance requirements.
  • Archive timing: Media teams will frequently archive source files immediately upon ingestion in addition to final cuts, whereas only final versions need to be archived in business use cases.

We’ll explain each of these elements in more detail below.

Large Media File Sizes Slow Down Backups

The most obvious difference is that media files are BIG. Most business documents are under 30MB in size, yet even a single second of video could be larger than 30MB depending on the resolution and frame rate. In a typical business use case, a company might plan to back up files overnight, say for incremental backups, or over a weekend for a full backup. But backing up large media files might exceed those windows. And you can’t expect deduplication to shorten backup times or reduce backup sizes, either. Video and images don’t dedupe well.

Furthermore, the editing process generates a flurry of intermediate or temporary files in the active content creation workspace that don’t need to be backed up because they can be easily regenerated from source files.

The best backup solutions for media allow you to specify exactly which directories and file types you want backed up, so that you’re taking time for and paying for only what you need.
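
To make that selectivity concrete, here’s a small, hypothetical dry-run sketch in Python: it walks a project folder and reports which files a media-aware backup would include, skipping common intermediate and cache extensions. The extension lists are assumptions for the example; which intermediates are safe to skip depends on your NLE and your pipeline.

import os

# Assumed extension lists for illustration; tune them to your own NLE and workflow.
BACKUP_EXTENSIONS = {".mov", ".mxf", ".braw", ".r3d", ".wav", ".prproj", ".fcpxml"}
SKIP_EXTENSIONS = {".cfa", ".pek", ".pkf", ".tmp", ".cache"}

def plan_backup(project_root: str):
    """Dry run: list the files a selective media backup would include and the bytes it would skip."""
    included, skipped_bytes = [], 0
    for dirpath, _dirs, files in os.walk(project_root):
        for name in files:
            path = os.path.join(dirpath, name)
            ext = os.path.splitext(name)[1].lower()
            if ext in BACKUP_EXTENSIONS:
                included.append(path)
            elif ext in SKIP_EXTENSIONS:
                skipped_bytes += os.path.getsize(path)
    return included, skipped_bytes

files, skipped = plan_backup("/Volumes/NAS/projects/spring-campaign")
print(f"{len(files)} files selected for backup; {skipped / 1e9:.1f} GB of intermediates skipped.")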

Archiving to Save Space on Production Storage

Media teams tend to use archiving to reduce production storage costs, whereas businesses are much more likely to use archives for compliance purposes. High-resolution video editing, for example, requires expensive, high-performance storage to deliver multiple streams of content to multiple users simultaneously without dropping frames. Since high-resolution files are so large, this expensive storage resource fills up quickly. Once a project is complete, most media teams prefer to clear space for the next project. Archiving completed projects and infrequently-used assets can keep production storage capacities under control.

Media asset managers (MAMs) can simplify the archive, retrieval, and distribution process. Assets can be archived directly through the MAM’s user interface, and after archiving, thumbnails or proxies remain visible to users. Archived content remains fully searchable by its metadata and can also be retrieved directly through the MAM interface. For more information on MAMs, read “What’s the Diff: DAM vs. MAM.”

Media teams can manage budgets effectively by strategically archiving select media files to less expensive storage. Content is readily accessible should it be needed for redistribution, repurposing, and monetization, especially when archiving is done properly.

Permanently Secure Source Files and Raw Footage on Ingest

A less obvious way that media workflows are different from business workflows is that video files are fixed content; they aren’t actually altered during the editing process. Instead, editing suites compile changes to be made to the original and apply them only when making the final cut and format for delivery. Since these source files are often irreplaceable, many facilities save a copy to secondary storage as soon as they’re ingested into the workflow. This copy serves as a backup to the file on local storage during the editing process. Later, when the local copy is no longer actively being used, it can be safely deleted knowing it’s secured in the archive. I mean backup. Wait, which is it?

Whether you call it archive or backup, make a copy of source files in a storage location that lives forever and is accessible for repurposing throughout your workflow.
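
One hypothetical way to sketch that habit in Python: as soon as a card or drive has been offloaded into an ingest folder, checksum each source file and copy it to an object storage bucket before editing starts. The bucket, endpoint, and folder names are placeholders, and a production tool would also verify the upload and record the checksum in your asset manager.

import hashlib
import os
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.us-west-004.backblazeb2.com",  # example S3 compatible endpoint
    aws_access_key_id=os.environ["B2_KEY_ID"],
    aws_secret_access_key=os.environ["B2_APP_KEY"],
)

def secure_on_ingest(ingest_dir: str, bucket: str = "source-masters") -> None:
    """Checksum and copy every freshly ingested source file to the archive bucket."""
    for name in sorted(os.listdir(ingest_dir)):
        path = os.path.join(ingest_dir, name)
        if not os.path.isfile(path):
            continue
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8 * 1024 * 1024), b""):
                digest.update(chunk)
        key = f"ingest/{name}"
        # Store the checksum alongside the object so the copy can be verified later.
        s3.upload_file(path, bucket, key, ExtraArgs={"Metadata": {"sha256": digest.hexdigest()}})
        print(f"secured {name} as {key}")

secure_on_ingest("/Volumes/INGEST/2024-03-07_shoot")  # placeholder ingest folder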

To see how all this works in the real world, here’s how UCSC Silicon Valley designed a new solution that integrates backup, archive, and asset management with Backblaze B2 Cloud Storage so that their media is protected, preserved, and organized at every step of their workflow.

Still from UC Scout AP Psychology video.

How UCSC Silicon Valley Secured Their Workflow’s Data

UCSC Silicon Valley built a greenfield video production workflow to support UC Scout, a University of California online learning program that gives high school students access to the advanced courses they need to be eligible and competitive for college. Three teams of editors, producers, graphic designers, and animation artists—a total of 22 creative professionals—needed to share files and collaborate effectively, and Digital Asset Manager, Sara Brylowski, was tasked with building and managing their workflow.

Brylowski and her team had specific requirements:

  • For backup, they needed to protect active files on their media server with an automated backup solution that allowed accidentally deleted files to be easily restored.
  • To manage storage capacity more effectively on their media server, they wanted to archive completed videos and other assets that they didn’t expect to need immediately.
  • To organize content, they needed an asset manager with seamless archive capabilities, including fast self-service archive retrieval.

They wanted the reliability and simplicity of the cloud to store both their backup and archive data. “We had no interest in using LTO tape for backup or archive. Tape would ultimately require more work and the media would degrade. We wanted something more hands off and reliable,” Brylowski explained.

They chose Backblaze B2 Cloud Storage along with a Facilis media storage system and CatDV media asset management software.

The solution delivered results quickly. Production team members could fully focus on creating content without concern for storage challenges. Retrievals and restores, as needed, became a breeze. Meanwhile, UCSC IT staff were freed from wrestling gnarly video data. And the whole setup helped Brylowski bring UC Scout’s off-premises storage costs under control as she plans for significant content growth ahead.

“With our new workflow, we can manage our content within its life cycle and at the same time, have reliable backup storage for the items we know we’re going to want in the future. That’s allowed us to concentrate on creating videos, not managing storage.”
—Sara Brylowski, UCSC Silicon Valley

To find out exactly how Brylowski and her team solved their challenges and more, read the full case study on UC Scout at UCSC Silicon Valley and learn how their new workflow enables them to concentrate on creating videos, not managing storage.

Looking for storage to fit your backup or archive workflows? Backblaze B2 Cloud Storage is simple to use, always active, and workflow friendly.

The post How Backup and Archive Are Different for Professional Media Workflows appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

]]>
https://www.backblaze.com/blog/backup-vs-archive-professional-media-production/feed/ 3