Presentation: javamug-chatgpt.pdf

It’s inescapable. The capabilities of ChatGPT and Large Language Models (LLMs) have become discussion topics on the news, in social gatherings, online, and at work. Things that would have seemed impossible a few years ago are now nearly pedestrian in their ubiquity. Yet while these systems demo very well, very few people actually understand what is going on under the hood, and worse, what is or isn’t possible.

Additionally, there are concerns about the costs involved, the security risks, and the inherent latencies of cloud-based systems, along with hidden costs that are rarely factored into model deployments. And how do the broader IT trends of increased parallelization, heterogeneity, and distributed systems affect the use of these models?