> Sample usage for March 5:
> [attachment 282557]
> ChatGPT Bot is only enabled on less than five forums out of more than 200 nodes.
> 16K+ users log in daily.
Thanks for the info. Is a request equal to asking one question of the bot?
> You can sort of check, ask ChatGPT versions what the latest nginx version is or some known software and it usually lists the date of the version. ChatGPT free GUI was around October 2021 when it first launched, but once ChatGPT Plus was launched, free and paid is around September 2021 for GUI at least. No idea on API version.
Yeah I checked the demo and it looks like the API knowledge cutoff date is the same as the standard. Was hoping it was up to date because I wanted to use it to generate quizzes for my football forum but it thinks our club manager is the same guy who was sacked 3 managers ago.
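If you want to run that same check against the API itself rather than the web UI, a minimal sketch with the official openai Python package (v1+) might look like the following; the model name, prompt and temperature here are just placeholder assumptions, not what the add-on actually sends:

```python
# Rough sketch of the "ask about a known release" trick for probing an API
# model's training cutoff. Assumes the official openai package (v1+) and an
# OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",  # example model name, swap in whatever you use
    messages=[{
        "role": "user",
        "content": "What is the newest stable nginx release you know of, "
                   "and roughly when was it published?",
    }],
    temperature=0,  # keep the answer as deterministic as possible
)

print(resp.choices[0].message.content)
# If the newest release it names dates from mid/late 2021, the API model's
# cutoff is likely the same as the ChatGPT web version's.
```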
> Thanks for the info. Is a request equal to asking one question of the bot?
Looks like it.
> Looks like it.
80 cents for 300 questions. Could get expensive quickly unless you figured out a way to monetize the feature on your forum.
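As a rough back-of-envelope on that "80 cents for 300 questions" figure: the per-token price below is an assumption (gpt-3.5-turbo's launch rate of $0.002 per 1K tokens), so substitute whatever your model actually charges:

```python
# Back-of-envelope cost estimate; the per-1K-token price is an assumption.
PRICE_PER_1K_TOKENS = 0.002      # USD, assumed gpt-3.5-turbo rate
SAMPLE_COST = 0.80               # USD, from the usage screenshot above
SAMPLE_QUESTIONS = 300

cost_per_question = SAMPLE_COST / SAMPLE_QUESTIONS
tokens_per_question = cost_per_question / PRICE_PER_1K_TOKENS * 1000
print(f"~${cost_per_question:.4f} per question")          # ~$0.0027
print(f"~{tokens_per_question:.0f} tokens per question")  # ~1333 (prompt + reply)

# Rough projection: if 1% of 16K daily users asked one question per day.
daily_askers = 16_000 * 0.01
monthly_cost = daily_askers * cost_per_question * 30
print(f"~${monthly_cost:.2f} per month at that volume")   # ~$12.80
```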
> I see that permissions are granted per node, does this mean I do not have the option to grant permissions to certain usergroups?
I don't know about this addon, but with his other addon you can set permissions per usergroup as a whole, or combinations of per usergroup per node.
> Yeah % of messages might not be as high but all relative. Just being wary of bill shock!
Yup, totally understand. Even with the tiny usage of our moderators testing it out (the other addon), I can see how tons of users leveraging it every day could add up.
> Could get expensive quickly unless you figured out a way to monetize the feature on your forum.
Ya, I guess "expensive" is very relative compared to the value it provides for the users/community. If the other addon proves to be "valuable" to our community (users posting the content and receivers of the content) I could see it being a value-add to our paid memberships as a new extra feature for them.
> I don't know about this addon, but with his other addon you can set permissions per usergroup as a whole, or combinations of per usergroup per node.
Yeah you're right, I can see the utility of this addon, that's why I'm hoping it can be restricted for use by certain usergroups - i.e. paying members.
> Yeah you're right, I can see the utility of this addon, that's why I'm hoping it can be restricted for use by certain usergroups - i.e. paying members.
As you can set the forum/s where the bot replies, presumably you only allow paying upgraded users to those forums.
> As you can set the forum/s where the bot replies, presumably you only allow paying upgraded users to those forums.
On my forum we currently have ads, and members that upgrade to paid membership have the ads removed along with a few other perks such as post edit permissions etc.
I don't think I would want to have a separate forum just for use of this addon, but if that's the only option then I suppose I would give it a try. That's why I'm hoping it can be restricted for use by certain usergroups.
> Oh my god gentlemen, I'm amazed how active this thread was tonight
> Unfortunately, I can't read all of your posts as I'm working on an update.
> I see a lot of people wanting to get the bot to respond to mentions and the ability to turn off AI in topics. Both of these features are accepted and will appear in the add-on in the future. To avoid further duplicate feature requests and structure all requests, I created a special forum section. So now if you want to make a suggestion regarding my add-ons, I will ask you to do so exclusively in this section.
> If you contacted me with an individual question here, please do so again at devsell.io
> Love <3
You’ve done an amazing job.
Thanks, I'll probably end up doing that if there's no way to allow access via usergroup permissions. I assume that was what you implied by this:
> If you had such a forum where your paid members can post threads, but others can only view, then you may get more upgrades if people see the value of the bot (be it useful info or just fun)
> I have a dedicated forum. I think it would annoy members to have the bot answering everywhere.
> Oh my god gentlemen, I'm amazed how active this thread was tonight!
LOL, you clearly struck while the AI iron is red-hot right now!
> I see a lot of people wanting to get the bot to respond to mentions
BRILLIANT! Can't wait to test it!
> I created a special forum section. So now if you want to make a suggestion regarding my add-ons, I will ask you to do so exclusively in this section.
> If you contacted me with an individual question here, please do so again at devsell.io
Very cool / smart idea!
> Where would you like us to post ideas that aren't yet fleshed-out into "suggestions"? For example:
You can ask questions about the add-on here, in the thread on my forum or in private messages.
> I'm wondering if there's a reason to limit the quantity or velocity of usage per-member? OpenAI has this already built into ChatGPT (I've received the "too many in one hour" response before on the free version)... and I'm curious if something like that makes sense here?
This decision should be made solely by you, depending on the specific circumstances. This question is not directly related to the add-on; it is more about how your forum should be organized. Everything is simple here: if you have reasons to impose restrictions, you should introduce them; if not, you should not.
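Since the add-on leaves throttling to the forum owner, a generic per-member hourly cap of the kind described above could be sketched like this. None of it is the add-on's code, and every name is invented for illustration:

```python
# Sliding-window limiter of the "too many in one hour" kind mentioned above.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 3600    # one hour
MAX_REQUESTS = 10        # per member per window; pick whatever fits your budget

_history = defaultdict(deque)   # user_id -> timestamps of recent bot questions

def allow_request(user_id):
    """Return True if this member may ask the bot another question now."""
    now = time.time()
    stamps = _history[user_id]
    # Drop timestamps that have aged out of the window.
    while stamps and now - stamps[0] > WINDOW_SECONDS:
        stamps.popleft()
    if len(stamps) >= MAX_REQUESTS:
        return False
    stamps.append(now)
    return True

# Usage: check before forwarding a member's question to the OpenAI API.
if allow_request(user_id=42):
    print("forward the question to the bot")
else:
    print("too many questions this hour, try again later")
```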
This version requires the [021] ChatGPT Framework 1.2.0+ add-on to be installed.
Added the "Max responses per thread" option, which will allow you to limit the number of bot responses in one topic.
Values for the "Temperature" option are limited to the range 0 to 1, in 0.1 steps.
The bot's context is limited to the last 10 posts before the post it is replying to. This is done to avoid the token limit in long threads.
Now the error logs will contain the response from OpenAI, which will allow...
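To make those limits concrete, here is a small illustrative sketch of the behaviour the changelog describes. It is not the add-on's actual implementation; all names and data structures are invented for the example:

```python
# Illustrative only: mimics the changelog's limits, not the add-on's real code.

def clamp_temperature(value):
    """Keep the 'Temperature' option inside 0-1, snapped to 0.1 steps."""
    return round(min(max(value, 0.0), 1.0), 1)

def bot_may_reply(bot_replies_in_thread, max_per_thread):
    """'Max responses per thread': the bot stops once the cap is reached."""
    return bot_replies_in_thread < max_per_thread

def build_context(thread_posts, limit=10):
    """Send only the last `limit` posts before the one being answered,
    which keeps long threads under the model's token limit."""
    return [{"role": "user", "content": post} for post in thread_posts[-limit:]]

print(clamp_temperature(1.37))                                # -> 1.0
print(clamp_temperature(0.37))                                # -> 0.4
print(bot_may_reply(5, max_per_thread=5))                     # -> False
print(len(build_context([f"post {i}" for i in range(50)])))   # -> 10
```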