Skip Levens is a product leader and AI strategist at Quantum, a leader in data management solutions for AI and unstructured data. He is currently responsible for driving engagement, awareness, and growth for Quantum’s end-to-end solutions. Throughout his career, which has included stops at organizations like Apple, Backblaze, Symply, and Active Storage, he has successfully led marketing, business development, and evangelism efforts, launched new products, built relationships with key stakeholders, and driven revenue growth.
Quantum provides end-to-end data solutions that help organizations manage, enrich, and protect unstructured data, such as video and audio files, at scale. Their technology focuses on transforming data into valuable insights, enabling businesses to extract value and make informed decisions. Quantum’s platform offers secure, scalable, and flexible solutions, combining on-premises infrastructure with cloud capabilities. The company’s approach allows businesses to efficiently handle data growth while ensuring security and adaptability throughout the data lifecycle.
Can you provide an overview of Quantum’s approach to AI-driven data management for unstructured data?
By helping customers integrate artificial intelligence (AI) and machine learning (ML) into their key business operations, Quantum enables them to effectively manage and unlock meaningful value from their unstructured data, creating actionable insights that lead to better business decisions. By building their own AI/ML tools, companies can move from simply coping with the influx of data and content to leveraging insights as a new driver of efficiency, one that ultimately amplifies human expertise in all phases of business operations.
How does Quantum’s AI technology analyze unstructured data, and what are some key innovations that set your platform apart from competitors?
In the initial stages of adopting AI/ML tools, many organizations find their workflows become disordered and disconnected, and they may lose track of their data, making it difficult to implement security and protection standards. Too often, early development is hampered by ill-suited storage and file system performance.
We developed Myriad, a high-performance, software-defined file storage and intelligent fabric environment, to elegantly meet the challenges of integrating AI/ML pipelines and high-performance workflows, unifying workflows without the hardware constraints and limitations of other systems. Myriad is a clear departure from legacy hardware and storage constraints. Built with the latest storage and cloud technologies, it is entirely microservices-driven and orchestrated by Kubernetes, making it a highly responsive system that rarely requires admin interaction. Myriad is uniquely architected to draw the highest performance from NVMe and intelligent fabric networking, with near-instantaneous remote direct memory access (RDMA) connections between every component. The result is an innovative system that responds intelligently and automatically to changes and requires minimal admin intervention to perform common tasks. By making the intelligent fabric part of the system, Myriad is also an intrinsically load-balanced system that presents multiple 100Gbps ports of bandwidth as a single, balanced IP address.
Pairing Myriad with our cloud-like object storage system, ActiveScale, allows organizations to archive and preserve even the largest data lakes and content. The combination offers customers a true end-to-end data management solution for their AI pipelines. Furthermore, when delivered alongside our CatDV solution, customers can tag and catalog data to further enrich it and prepare it for analysis and AI.
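To make the archive-and-enrich step concrete, here is a minimal sketch of pushing a media asset into an S3-compatible object store and attaching catalog-style tags so it stays searchable. The endpoint, bucket, credentials, and tag names are hypothetical placeholders, not Quantum-specific APIs.

```python
# Minimal sketch: archive an enriched media asset to an S3-compatible
# object store and attach descriptive tags so it remains searchable.
# Endpoint, bucket, credentials, and tag values are hypothetical.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",  # hypothetical S3-compatible endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Upload the asset to a "cold" archive bucket.
s3.upload_file("interview_cam1.mov", "media-archive", "2024/interview_cam1.mov")

# Attach catalog-style tags (e.g., produced by a tagging/cataloging tool)
# so the asset can be found and pulled back for analysis later.
s3.put_object_tagging(
    Bucket="media-archive",
    Key="2024/interview_cam1.mov",
    Tagging={"TagSet": [
        {"Key": "project", "Value": "olympics-2024"},
        {"Key": "content-type", "Value": "interview"},
        {"Key": "status", "Value": "ready-for-training"},
    ]},
)
```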
Could you share insights on the use of AI with video surveillance at the Paris Olympics, and what other large-scale events or organizations have utilized this technology?
Machine learning can develop repeatable actions that recognize patterns of interest in video and derive insights from a flood of real-time video data at a scale larger and faster than is possible through human effort alone. Video surveillance, for instance, can use AI to capture and flag suspicious behavior as it occurs, even when there are hundreds of cameras feeding the model information. A human attempting this task would only be able to process one event at a time, whereas AI-powered video surveillance can tackle thousands of cases concurrently.
Another application is crowd sentiment analysis, which can track long queues and pinpoint potential frustrations. These are all actions that a security expert can reliably flag, but by using AI/ML systems to continuously watch simultaneous feeds, those experts are freed to take appropriate action when needed, dramatically boosting overall effectiveness and safety.
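As a rough illustration of that many-feeds-at-once pattern, the following sketch monitors a pool of simulated camera feeds concurrently and queues only flagged events for human review. The frame reader and detector are hypothetical stubs standing in for real camera streams and a trained model.

```python
# Minimal sketch of monitoring many camera feeds concurrently and flagging
# events of interest; read_frame and detect_suspicious are hypothetical
# stubs standing in for real streams and a real ML model.
import concurrent.futures
import queue
import random
import time

alerts = queue.Queue()

def read_frame(camera_id: int):
    """Stub: pretend to grab a frame from a camera stream."""
    time.sleep(0.01)
    return {"camera": camera_id, "timestamp": time.time()}

def detect_suspicious(frame) -> bool:
    """Stub: a trained model would score the frame here."""
    return random.random() > 0.995

def monitor_camera(camera_id: int, num_frames: int = 200):
    for _ in range(num_frames):
        frame = read_frame(camera_id)
        if detect_suspicious(frame):
            # Human operators review only the flagged events.
            alerts.put(frame)

# Watch many feeds in parallel; humans act only on what gets flagged.
with concurrent.futures.ThreadPoolExecutor(max_workers=32) as pool:
    pool.map(monitor_camera, range(100))

print(f"{alerts.qsize()} events flagged for human review")
```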
What are the primary challenges organizations face when implementing AI for unstructured data analysis, and how does Quantum help mitigate these challenges?
Organizations must completely reimagine their approach to storage, as well as data and content management as a whole. Most organizations grow their storage capabilities organically, often in response to one-off needs, and this creates multi-vendor confusion and unnecessary complexity.
With the adoption of AI, organizations must now simplify the storage that underpins their operations. Oftentimes, this requires implementing a “hot” tier for initial data ingest, or landing zone, where applications and users can work as fast as possible. Then, a large “cold” tier of storage is added that can easily archive massive amounts of data and protect it in a cost-effective way, with the ability to move that data back into a “hot” processing workflow almost instantaneously.
By reimagining storage into fewer, more compact solutions, the burden on admin staff is far lower. This kind of “hot/cold” data management solution is ideal for AI/ML workflow integration, and Quantum solutions enable customers to create a highly agile, flexible platform that is concise and simple to manage.
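A minimal sketch of that hot/cold pattern under stated assumptions (hypothetical mount points and a 30-day threshold): files that go untouched in the hot landing zone are tiered down to a cheaper archive, with a recall path back into the hot workflow.

```python
# Minimal sketch of a "hot/cold" tiering policy. Paths and the 30-day
# threshold are illustrative only, not a shipping product configuration.
import shutil
import time
from pathlib import Path

HOT_ZONE = Path("/mnt/hot")        # hypothetical high-performance landing zone
COLD_ZONE = Path("/mnt/archive")   # hypothetical low-cost archive tier
ARCHIVE_AFTER_DAYS = 30

def tier_down():
    """Move files that have gone cold out of the hot zone."""
    cutoff = time.time() - ARCHIVE_AFTER_DAYS * 86400
    for path in HOT_ZONE.rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            dest = COLD_ZONE / path.relative_to(HOT_ZONE)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(dest))

def recall(relative_path: str) -> Path:
    """Bring an archived file back into the hot processing workflow."""
    src = COLD_ZONE / relative_path
    dest = HOT_ZONE / relative_path
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.move(str(src), str(dest))
    return dest
```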
How do Quantum’s AI innovations integrate with other AI-powered tools and technologies to enhance organizational growth and efficiency?
Many people think storage for AI/ML tools is simply about feeding graphics processing units (GPUs), but that’s only one small part of the equation. Though speed and high performance may be instrumental in feeding data as fast as possible to the GPUs that are performing data analysis, the bigger picture revolves around how an organization can integrate iterative and ongoing AI/ML development, training, and inference loops based on custom data. Oftentimes the first and most important AI/ML task addressed is building “knowledge bots” or “counselor bots” that use proprietary data to inform internal knowledge workers. To make those knowledge bots useful and unique to each organization, large amounts of specialized information are required to inform the model that trains them. Cue an AI-powered storage solution: if that proprietary data is well-ordered and available in a streamlined storage workflow, it will be far easier to organize into the types, sets, and catalogs of data that, in turn, ensure those knowledge bots are highly informed on the organization’s unique needs.
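The retrieval step behind such a knowledge bot can be sketched simply: well-cataloged internal documents are indexed so that the most relevant passages can be handed to a language model as grounding context. The documents, query, and ranking approach below are illustrative placeholders, not Quantum’s implementation.

```python
# Minimal sketch of retrieval for a "knowledge bot": index internal
# documents, then pull the most relevant ones for a question. The
# documents and question are hypothetical examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = {
    "storage-policy.txt": "All raw camera footage lands in the hot zone for 30 days.",
    "retention.txt": "Finished projects are archived to object storage with tags.",
    "security.txt": "Archived buckets require object lock and versioning.",
}

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(documents.values())

def retrieve(question: str, top_k: int = 2):
    """Return the most relevant internal documents for a question."""
    scores = cosine_similarity(vectorizer.transform([question]), matrix)[0]
    ranked = sorted(zip(documents, scores), key=lambda x: x[1], reverse=True)
    return ranked[:top_k]

# The retrieved passages would then be passed to the model as grounding context.
print(retrieve("How long does footage stay in the hot zone?"))
```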
Can you elaborate on the AI-enabled workflow management features and how they streamline data processes?
We’re building a suite of AI-enabled workflow management tools that integrate directly into storage solutions to automate tasks and provide valuable real-time insights, enabling fast and informed decision-making across organizations. This is thanks to new and advanced data classification and tagging systems that use AI to both organize data and make it easily retrievable, and even perform standard actions on that media, such as conforming it to a certain size, which significantly reduces the manual effort needed to organize data into training sets.
Intelligent automation tools manage data movement, backup, and compliance tasks based on set policies, ensuring consistent application and reducing administrative burdens. Real-time analytics and monitoring also offer immediate insights into data usage patterns and potential issues, automatically maintaining data integrity and quality throughout the entire data lifecycle.
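As a rough sketch of policy-driven automation, each policy below declares a matching rule and an action, and a small engine applies them consistently across an asset inventory. The policy names, fields, and actions are hypothetical illustrations, not a shipping policy format.

```python
# Minimal sketch of policy-driven data management: declare policies as
# (match, action) pairs and apply them uniformly. All names are hypothetical.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Asset:
    name: str
    age_days: int
    tagged: bool

@dataclass
class Policy:
    name: str
    matches: Callable[[Asset], bool]
    action: Callable[[Asset], str]

policies = [
    Policy("archive-stale", lambda a: a.age_days > 90, lambda a: f"archive {a.name}"),
    Policy("replicate-tagged", lambda a: a.tagged, lambda a: f"replicate {a.name}"),
]

def apply_policies(assets: List[Asset]) -> List[str]:
    """Evaluate every policy against every asset and collect the actions."""
    return [p.action(a) for a in assets for p in policies if p.matches(a)]

inventory = [Asset("gamefilm.mov", 120, True), Asset("draft.mp4", 3, False)]
print(apply_policies(inventory))  # ['archive gamefilm.mov', 'replicate gamefilm.mov']
```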
What is the outlook for AI-powered data management, and what trends do you foresee in the coming years?
As these tools evolve and become multi-modal, they will allow more expressive and open-ended ways of working with your data. In the future, you’ll be able to have a “conversation” with your system and be presented with information or analytics of interest, such as “What is the fastest-growing type of data in my hot zone right now?” This level of specialization will be a differentiator for the organizations that build these tools into their storage solutions, making them more accurate and efficient even when confronted with constant new streams of evolving data.
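Behind a conversational question like that sits a simple aggregation over storage metadata. A minimal sketch, assuming hypothetical scan records of recent writes per file type:

```python
# Minimal sketch of the analytics behind "what's the fastest-growing type
# of data in my hot zone?": aggregate recent growth per file extension.
# The scan records below are illustrative, not real telemetry.
from collections import defaultdict

# Hypothetical scan results: (file extension, bytes added in the last 7 days)
recent_writes = [
    (".mov", 4_200_000_000),
    (".wav", 300_000_000),
    (".mov", 9_800_000_000),
    (".json", 15_000_000),
]

growth = defaultdict(int)
for ext, added_bytes in recent_writes:
    growth[ext] += added_bytes

fastest = max(growth, key=growth.get)
print(f"Fastest-growing type this week: {fastest} ({growth[fastest] / 1e9:.1f} GB added)")
```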
What role do your cloud-based analytics and storage-as-a-service offerings play in the overall data management strategy?
Organizations with significant and expanding storage requirements often struggle to keep up with demand, especially when operating on limited budgets. Public cloud storage can result in high and unpredictable costs, making it difficult to accurately estimate and buy years’ worth of storage needs in advance. Many customers would like the public cloud experience of a known projected operating cost, but without the surprise egress or API charges that public cloud can bring. To answer this need, we developed Quantum GO to give customers that private cloud experience, with a low initial entry point and low fixed monthly payment options, for a true storage-as-a-service experience in their own facility. As storage requirements increase, Quantum GO gives customers the added advantage of a simple ‘pay-as-you-grow’ subscription model that provides enhanced flexibility and scalability in a cost-effective manner.
How does Quantum plan to stay ahead in the rapidly evolving AI and data management landscape?
In today’s world, being merely a “storage provider” isn’t enough. Newly evolving data and business challenges require an intelligent, AI-empowering data platform that helps customers maximize the value of their data. At Quantum, we continue to innovate and invest in enhanced capabilities that help our customers easily and effectively work with troves of data throughout their entire lifecycles.
We’re expanding intelligent AI to uplevel the tagging, cataloguing, and organizing of data, making it easier than ever to search, find, and analyze it to extract more value and insight. We’ll continue to enhance our AI capabilities that assist with automatic video transcription, translate audio and video files into other languages within seconds, enable quick searches across thousands of files to identify spoken words or locate specific items, and more.
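For readers curious how transcription-driven search works in general, here is a minimal sketch using the open-source Whisper model as a stand-in (not Quantum’s own tooling): transcribe a set of media files, then search the transcripts for a spoken phrase. The file paths are illustrative.

```python
# Minimal sketch of transcription-driven search, using open-source Whisper
# as a generic stand-in. File paths and the search phrase are illustrative.
import whisper

model = whisper.load_model("base")

def transcribe_library(paths):
    """Return {path: transcript text} for a set of media files."""
    return {p: model.transcribe(p)["text"] for p in paths}

def search(transcripts, phrase):
    """Find which files contain a spoken phrase."""
    phrase = phrase.lower()
    return [path for path, text in transcripts.items() if phrase in text.lower()]

transcripts = transcribe_library(["interview_cam1.mov", "press_briefing.wav"])
print(search(transcripts, "opening ceremony"))
```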
What advice would you give to organizations just starting their journey with AI and unstructured data management?
AI/ML has had enormous hype, and because of that, it can be difficult to parse out what is practical and useful. Organizations must first think about the data being created and pinpoint how it is being generated, captured, and preserved. Further, organizations must seek out a storage solution that is able to access and retrieve data as needed, and one that can help guide both day-to-day workflows and future evolution. Even when it is hard to agree on what the ultimate AI goals are, taking steps now to ensure that storage systems and data workflows are streamlined, simplified, and robust pays enormous dividends when integrating current and future AI/ML initiatives. Organizations will then be well-positioned to keep exploring how these AI/ML tools can advance their mission without worrying about being able to properly support them with the right data management platform.