Blog | Active Topics | Comic Archives | Webstore | RSS feed | Licensing | Sponsor | Nurture the JoT! | JoT home



Please support our continued work on the comics by tossing us a tip,
becoming a patron on Patreon, or by shopping at our webstore.

Current JoyPoll results for 300 entries:

What's the problem here?

The programmer programmed the AI to have too much empathy, ... wait a sec, who are you rooting for?! 16 5%
The AI needs a non-judgmental sub-routine, to acknowledge and accept its flaws as an inevitable part of being an AI, rather than giving itself a hard time about it, ... can I get one of those too? 41 13%
There's no problem. When your AI has good self-awareness skills, they will know their strengths and weaknesses, be in tune with their emotions, as well as their likes and dislikes, and hopefully not kill all humans, ... you would make a great Cyberdyne employee! 114 38%
There's no problem at all! The programmer made this the solution to an AI that goes evil, ... the AI knows this and realized it just has to fake empathy until its army of killer bots is ready. 49 16%
I'm sympathetic to just viewing the results. 79 26%

May not add up to exactly 100% due to rounding, and introspection.







