Troubleshoot LoganAI Issues
Disclaimer: Validity of AI Responses
Topics:
- LoganAI Administration
- LoganAI Explain

LoganAI Administration
Issue | Resolution |
---|---|
I don't see any regions under Generative AI service region | LoganAI uses the OCI Generative AI service to provide the AI functionality. The list shows the regions that offer the Generative AI service and that your tenancy is subscribed to. If you don't see any regions in this list, verify that your tenancy is subscribed to one of the regions where Generative AI is available. A sketch for checking your tenancy's region subscriptions is shown after this table. |
Generative AI Connection Test failed | Following are some of the reasons the connection Test from the Administration UI can fail: |
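If you want to confirm your tenancy's region subscriptions outside the console, the OCI Python SDK can list them so you can compare against the regions where Generative AI is offered. This is a minimal sketch: it assumes a valid `~/.oci/config` default profile, and the `GENAI_REGIONS` set below is illustrative only, not an authoritative list, so check the current OCI Generative AI documentation for the regions that actually offer the service.

```python
# Minimal sketch: list the regions this tenancy is subscribed to and compare them
# against an illustrative set of Generative AI regions.
# Assumptions: the OCI Python SDK is installed and ~/.oci/config has a valid default profile.
import oci

# Illustrative only -- confirm the current Generative AI regions in the OCI documentation.
GENAI_REGIONS = {"us-chicago-1", "uk-london-1", "eu-frankfurt-1"}

config = oci.config.from_file()                      # default profile
identity = oci.identity.IdentityClient(config)

subscriptions = identity.list_region_subscriptions(config["tenancy"]).data
subscribed = {s.region_name for s in subscriptions}

print("Subscribed regions:", sorted(subscribed))
overlap = subscribed & GENAI_REGIONS
if overlap:
    print("Subscribed regions that appear in the illustrative Generative AI list:", sorted(overlap))
else:
    print("No subscribed region matches the illustrative Generative AI region list.")
```

If the script reports no overlap, subscribe the tenancy to a region where Generative AI is available, then reload the Generative AI service region list in the Administration UI.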
LoganAI Explain
Issue | Resolution |
---|---|
I don't see the LoganAI icon | Make sure LoganAI is enabled in your tenancy. See Enable LoganAI. |
Why do I keep seeing a message to accept the Generative AI license? | Your administrator can accept the Generative AI licenses on behalf of all users, let each user accept once, or require acceptance every time the feature is used in a session. If you are seeing this message multiple times, your administrator has enabled the option for users to accept the license once per session. Currently, the length of a session window is one hour. |
Where do I accept the Generative AI license? | See Accept the Licenses. |
I only see Generating ... after clicking the AI icon | LoganAI streams the results from the OCI Generative AI service as soon as the AI generates a response. The Generating... animation is displayed until a response is received. There are a few reasons why you may not see a response immediately. Reload the page and try with a smaller data set: stay on the first page and click the AI icon for all logs, so that only one page of logs is sent for summary. If this still does not work, your log records may be too large for AI processing. Try changing your log source and test with smaller log files. |
Error processing Generative AI request. Internal Server Error | This usually indicates the input data size for summarization is very large. See the previous issue for more details. |
This model's maximum context length is X tokens. However, you requested Y tokens. | As the message indicates, the input data sent to the AI is larger than the allowed context length. To reduce the size of the input data, follow the suggestions for the I only see Generating ... issue above: stay on the first page so that only one page of logs is sent for summary, or test with a log source that has smaller log files. A rough token-estimate sketch is shown after this table. |
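You can anticipate the context-length error by roughly estimating the token count of a page of log records before sending it for summarization. The sketch below is an approximation under stated assumptions: it uses a common characters-per-token heuristic (about four characters per token for English text) and a hypothetical 4096-token limit; the real tokenizer and the real limit depend on the model, so substitute the X value from the actual error message.

```python
# Rough pre-check for the "maximum context length is X tokens" error.
# Assumptions: ~4 characters per token (a coarse English-text heuristic) and a
# hypothetical 4096-token limit -- replace it with the limit from your error message.
def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Very rough token estimate; real tokenizers vary by model."""
    return int(len(text) / chars_per_token)

def fits_context(log_records: list[str], max_tokens: int = 4096) -> bool:
    """Return True if the concatenated log records likely fit in the model context."""
    total = sum(estimate_tokens(record) for record in log_records)
    print(f"Estimated tokens: {total} (limit: {max_tokens})")
    return total <= max_tokens

# Example: shrink the set of log records until the estimate fits,
# mirroring the "send only one page of logs" suggestion above.
page = ["2024-05-01T10:00:00Z ERROR payment-service timeout calling upstream"] * 500
while not fits_context(page):
    page = page[: len(page) // 2]
```

If the estimate is far above the limit reported in the error message, reduce the number of log records per page or switch to a log source with smaller records before retrying the summary.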