Targeting AI

Author: TechTarget Editorial
  • Summary

  • Hosts Shaun Sutner, TechTarget News senior news director, and AI news writer Esther Ajao interview AI experts from the tech vendor, analyst and consultant community, academia and the arts, as well as AI technology users from enterprises and advocates for data privacy and responsible use of AI. Topics relate to news events in the AI world, but the episodes are intended to have a longer, more "evergreen" run: they are in-depth and somewhat long form, aiming for 45 minutes to an hour in duration. The podcast also occasionally hosts guests from inside TechTarget, including its Enterprise Strategy Group and Xtelligent divisions, and includes some news-oriented episodes in which Sutner and Ajao review the news.
    Copyright 2023. All rights reserved.
Episodes
  • Closing the gap between open source and closed AI models
    2024/10/15

    In the debate between open and closed models, open source AI models are closing the gap with their closed source counterparts.

    Since Meta introduced its Llama generative AI models in February 2023, more enterprises have started to run their AI applications on open source models.

    Cloud providers like Google have also noticed this shift and have accommodated enterprises by introducing models from open source vendors such as Mistral AI and Meta. At the same time, proprietary closed source generative AI models from OpenAI, Anthropic and others continue to attract widespread enterprise interest.

    But the growing popularity of open source and open models has also opened the door for AI vendors like Together AI that support enterprises using open source models. Together AI runs its own private cloud and provides managed services for model fine-tuning and deployment. It also contributes to open source research models and databases.

    "We do believe that the future includes open source AI," said Jamie De Guerre, senior vice president of product at Together AI, on the latest episode of TechTarget's Targeting AI podcast.

    "We think that in the future there will be organizations that do that on top of a closed source model," De Guerre added. "However, there's also going to be a significant number of organizations in the future that deploy their applications on top of an open source model."

    Enterprises use and fine-tune open source models for concrete reasons, according to De Guerre.

    For one, open models give enterprises more privacy control within their own infrastructure, he said. Enterprises also have more flexibility: when organizations customize open source models, the resulting model is something they own.

    "If you think of organizations making a significant investment in generative AI, we think that most of them will want to own their destiny," he said. "They'll want to own that future."

    Enterprises can also choose where to deploy their fine-tuned models.

    However, there are gradations between what is fully open source and what is merely an open model, De Guerre said.

    An open model refers to a model whose vendor releases only the weights, not the training data or training code used to build it.

    "It still provides a lot of value because organizations can download it in their organization, deeply fine-tune it and own any resulting kind of fine-tuned version," De Guerre said. "But the models that go even further to release the training source code, as well as the training data used, really help the open community grow and help the open research around generative AI continue to innovate."

    Esther Ajao is a TechTarget Editorial news writer and podcast host covering artificial intelligence software and systems. Shaun Sutner is senior news director for TechTarget Editorial's information management team, driving coverage of artificial intelligence, unified communications, analytics and data management technologies. Together, they host the Targeting AI podcast series.

    46 min
  • Enterprise adoption of generative AI is accelerating
    2024/10/01

    Nearly two years after the mass consumerization of generative AI with the introduction of ChatGPT, the technology is now moving from experimentation to implementation.

    A recent survey of 832 professionals worldwide by TechTarget's Enterprise Strategy Group found that generative AI adoption has increased in the last year.

    "We're in the acceleration phase," said Mark Beccue, an analyst at Enterprise Strategy Group and an author of the survey report, on the Targeting AI podcast.

    Organizations are using generative AI in areas such as software development, research, IT operations and customer service, according to the survey.

    However, no single use case stands out as a top priority. Organizations are focusing on several applications of generative AI and still face challenges in adopting the technology.

    One is a need for more infrastructure, Beccue said.

    "They feel that the changes are needed to support infrastructure before they can proceed with GenAI," he said.

    This might include adding platforms for enterprise generative AI projects or more development tools, he added.

    "It's really everything that gets you to being able to build an app," Beccue continued.

    Organizations also lack consensus on which kind of AI model, open or closed source, best fits their needs.

    "It's probably both," Beccue said. "People are thinking about how to use these things and they're understanding that not one model fits everything that they need. So, they're looking through to see what works for them in certain instances."

    The enterprises that have found quick success with generative AI are ones that invested in AI years before it was popularized by OpenAI's ChatGPT, Beccue said.

    He said these are companies like Adobe, Zoom and ServiceNow, the last of which, for example, has used machine learning, natural language understanding, process automation and AIOps since at least 2017.

    "They did it in a way where they said, 'We think there is potential here for this to help us do what we do better,'" he said. "That was their driver."

    That groundwork made them ready when generative AI hit the market.

    48 min
  • Google head of product on generative AI strategy
    2024/09/16

    As one of the top cloud providers, Google Cloud also stands at the forefront of the generative AI market.

    Over the past two years, Google has been enmeshed in a push and pull with its chief competitors -- AWS, Microsoft and OpenAI -- in the race to dominate generative AI.

    Google has introduced a slate of new generative AI products in the past year, including its main proprietary large language model (LLM), Gemini, and the Vertex AI Model Garden. Last week, it also debuted Audio Overview, which turns documents into audio discussions.

    The tech giant has also faced criticism that it might be falling behind, following generative AI stumbles such as the malfunctioning of its initial image generator.

    Part of Google's strategy with generative AI is not only providing the technology through its own LLMs and those of many other vendors in the Model Garden, but also constantly advancing generative AI, said Warren Barkley, head of product at Google for Vertex AI, GenAI and machine learning, on the Targeting AI podcast from TechTarget Editorial.

    "A lot of what we did in the early days, and we continue to do now is … make it easy for people to go to the next generation and continue to move forward," Barkley said. "The models that we built 18 months ago are a shadow of the things that we have today. And so, making sure that you have ways for people to upgrade and continue to get that innovation is a big part of some of the things that we had to change."

    Google is also focused on helping customers choose the right models for their particular applications.

    The Model Garden offers more than 100 closed and open models.

    "One thing that our most sophisticated customers are struggling with is how to evaluate models," Barkley said.

    To help customers choose, Google recently introduced evaluation tools that let users enter a prompt and compare how different models respond.
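The side-by-side evaluation described above can be sketched in a few lines: send one prompt to several models and collect the responses for comparison. This is a minimal illustration of the idea, not Google's actual evaluation API; the model callables here are hypothetical stand-ins for real model clients.

```python
# Minimal sketch of side-by-side model evaluation: one prompt goes to
# several models, and the responses are collected for comparison.
# The lambdas below are hypothetical stand-ins for real model endpoints.

def compare_models(prompt, models):
    """Return {model_name: response} for a single prompt."""
    return {name: generate(prompt) for name, generate in models.items()}

# Stub "models" standing in for real clients (e.g. SDK calls to hosted LLMs).
models = {
    "model-a": lambda p: f"[model-a] answer to: {p}",
    "model-b": lambda p: f"[model-b] answer to: {p}",
}

results = compare_models("Summarize our Q3 sales report.", models)
for name, response in results.items():
    print(f"{name}: {response}")
```

In a real evaluation workflow, the stub callables would be replaced by API clients for the hosted models under comparison, and the collected responses would be scored by human raters or an automated judge.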

    The vendor is also working on AI reasoning techniques, which it sees as a way to move the generative AI market forward.

    46 min
