•   2 months ago

LLM Not Detected in New Hackathon Elastic Cloud Deployment (Trial Expired / Need LLM Access)

Hi Elastic Team,

I registered for Elastic Cloud using the official hackathon signup link (cta=hackathon / Devpost). In my earlier deployment, Agent Builder was working properly and OpenAI/Claude LLM was already connected, so I was able to build and test my agents successfully.

Now that deployment/trial has expired. I created a new deployment using another email, but in Agent Builder it shows “No Large Language Model detected”. The UI shows LLM options, but manual connection requires an external API key with billing/credits.

I am currently building multiple agents for the hackathon (Data Profiling, Data Masking, Data Generation, and Explainable Duplicate Decision Agent) and need LLM connectivity to continue development and testing.

Could you please help by enabling the default hackathon LLM connector for my new deployment or extending the previous deployment access?

Thanks,
Praneeth
Elastic Hackathon Participant

  • 7 comments

  • Manager   •   2 months ago

    The CTA shouldn't really make a difference (just for us to know the source of the traffic). Both Serverless and Cloud Hosted should have the default LLMs at this point. What do you get if you run GET /_inference in the Console of Dev Tools? Does it list the LLMs for you?

  •   •   2 months ago

    Hi Philipp, thanks! I ran GET /_inference in Dev Tools. It lists the following endpoints: .elser-2-elasticsearch (sparse_embedding), .multilingual-e5-small-elasticsearch (text_embedding), and .rerank-v1-elasticsearch (rerank). I don’t see any endpoint for completion / chat_completion / text_generation. Agent Builder UI still shows “No Large Language Model detected”. Is there a way to enable the default chat LLM inference endpoint for this hackathon deployment?
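    The check above can also be done programmatically. A minimal sketch, assuming the `GET /_inference` response is shaped as an `endpoints` array with `inference_id` and `task_type` fields (the sample payload below is illustrative, modeled on the three endpoints listed; a real response comes from Dev Tools or the Elasticsearch client):

    ```python
    # Illustrative _inference response, modeled on the endpoints listed above.
    # A real deployment's response comes from GET /_inference in Kibana Dev Tools.
    sample_response = {
        "endpoints": [
            {"inference_id": ".elser-2-elasticsearch", "task_type": "sparse_embedding"},
            {"inference_id": ".multilingual-e5-small-elasticsearch", "task_type": "text_embedding"},
            {"inference_id": ".rerank-v1-elasticsearch", "task_type": "rerank"},
        ]
    }

    # A chat LLM would show up under a chat-style task type rather than
    # embedding/rerank; "completion" and "chat_completion" are the ones to look for.
    CHAT_TASK_TYPES = {"completion", "chat_completion"}

    def chat_endpoints(response):
        """Return IDs of inference endpoints with a chat/completion task type."""
        return [
            e["inference_id"]
            for e in response["endpoints"]
            if e["task_type"] in CHAT_TASK_TYPES
        ]

    found = chat_endpoints(sample_response)
    print(found or "No chat-capable LLM endpoint detected")
    ```

    Run against the sample payload, this prints the "not detected" message, which matches what the Agent Builder UI reports for this deployment: only embedding and rerank endpoints, no chat LLM.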

  •   •   2 months ago

    Hi Philipp, I got it working. In Cloud Hosted, only inference endpoints for embeddings/rerank were available, so Agent Builder couldn’t detect a chat LLM. But in Elastic Cloud Serverless, default managed LLMs like OpenAI/Claude/Gemini are already enabled, so agents work directly without manual API setup.

  • Manager   •   2 months ago

    For Cloud Hosted, did you pick the latest version 9.3? And which provider and region? I have the LLMs in Cloud Hosted and they should be available for all trials. But let me double check.

    And in any case, good if it's working on Serverless — that's the easiest place for getting all of this done anyway :)

  •   •   2 months ago

    Hey, thanks for checking!
    Yes, I selected Cloud Hosted and chose the latest version (9.3).
    Provider: GCP
    Region: US Central 1 (Iowa)

    But in Kibana → AI Assistant / Agent Chat, it still shows “No Large Language Model detected” and asks me to Connect LLM.

    Since it’s working fine on Serverless, I’ve started building my project there for now.

    Please let me know if Cloud Hosted trials in this region should have LLMs enabled, or if there’s any setting/flag I need to turn on.

  • Manager   •   2 months ago

    To close the loop on this: The LLMs should be there. This looks like a bug. We'll take a look at it.

  •   •   2 months ago

    Thanks Philipp! Appreciate you confirming.
    I’ll continue building on Serverless for now since it’s working smoothly.
    Please let me know if you need any logs/screenshots from my side to help debug the Cloud Hosted issue.
