The Singular Sins and Blessings of Creators 📜

Intended audience 👷‍♀️: You may belong to the intended audience of these writings if ...

In general, the target audience of this article is someone one might refer to as a content creator.

Content creators may create something new, original, never seen before 📝. Perhaps they have sources or inspirations for their work, but their creativity and their specific path of interest come from nothing but the randomness of space 🌌.

Or is there something else to it? Will-power 👽?

When you do not create things, you become defined by your tastes rather than your ability. Your tastes only narrow & exclude people. So create. ~ Why the lucky stiff.
Link: "The Act of Creation", a text-based article by Andrew Stephens

Sins a creator can commit 🧟‍♂️

It is important for a creator to remember that they can make mistakes.

Listen to criticism, but do not be afraid to apply meta-criticism back at your critics if you think they are wrong.

Plot twist 🔀:

Some of these "sins" are needed to get started with creating in the first place, and some of the sins listed might even contradict each other. I will give some examples; maybe you can come up with points for the others:

Do neural networks make writers and artists obsolete? 🧠

Text-generating neural networks are sampled with a parameter called "temperature", which attempts to replicate the randomness that seems natural to text written by humans (or any other kind of human creation 🖌). If no randomness or temperature is added to the neural network's output, its writing becomes stale, repetitive and therefore less useful.
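
To illustrate, here is a minimal sketch of temperature sampling in general, not code from any particular model; the function name and the toy scores are made up for the example:

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0):
    """Pick one token index from raw model scores (logits).

    Low temperature sharpens the distribution (safe, repetitive choices),
    high temperature flattens it (more randomness, more surprises).
    """
    if temperature <= 0:
        # Temperature 0 degenerates to greedy decoding: always the top score.
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [score / temperature for score in logits]
    top = max(scaled)                      # subtract max for numerical stability
    weights = [math.exp(s - top) for s in scaled]
    total = sum(weights)
    probs = [w / total for w in weights]   # softmax
    return random.choices(range(len(probs)), weights=probs, k=1)[0]

# Toy scores for three candidate next words: "the", "a", "banana".
logits = [2.0, 1.5, 0.2]
print(sample_with_temperature(logits, temperature=0.2))  # almost always 0 ("the")
print(sample_with_temperature(logits, temperature=2.0))  # noticeably more varied
```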

I can highly recommend Stephen Wolfram's article on the matter, especially the part where he talks about temperature 🌡.

Link: Stephen Wolfram explains ChatGPT

As large language models scale 📈, we start to bear witness to the limits of the current generation of neural network technology. The most important parameters that characterize model performance are the size of the training data set and the number of network parameters 📀.
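
As a rough sketch of what that relationship looks like, here is the kind of scaling law researchers fit to it; the functional form and constants are roughly those reported by Hoffmann et al. (2022, the "Chinchilla" paper), so treat the numbers as illustrative, not exact:

```python
def estimated_loss(n_params, n_tokens,
                   E=1.69, A=406.4, B=410.7, alpha=0.34, beta=0.28):
    """Estimated pre-training loss L(N, D) = E + A / N**alpha + B / D**beta.

    N = number of network parameters, D = number of training tokens.
    E is the irreducible term, the 'entropy of natural language' that no
    amount of extra scale removes. Constants are approximate published fits.
    """
    return E + A / n_params**alpha + B / n_tokens**beta

# Each 10x jump in model size and data buys a smaller drop in loss,
# creeping towards the irreducible floor E ~ 1.69.
for n, d in [(1e9, 2e10), (1e10, 2e11), (1e11, 2e12)]:
    print(f"N={n:.0e}, D={d:.0e} -> estimated loss {estimated_loss(n, d):.2f}")
```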

Performance here means the certainty with which a large language model predicts the next word. This seems to approach a limit that cannot be crossed without investing exponentially more compute. The entropy of natural language and text seems to hint at something these networks cannot recreate without first having content remotely like it lurking in the hidden, dark and dusty corners of their training data set 🕸.
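
A common way to put a number on that "certainty" is perplexity: roughly, the average number of next words the model is effectively still choosing between. A tiny sketch with made-up probability values:

```python
import math

def perplexity(next_word_probs):
    """Perplexity of a model over a sequence, given the probability it
    assigned to each actual next word. Lower = more certain = better."""
    avg_log_prob = sum(math.log(p) for p in next_word_probs) / len(next_word_probs)
    return math.exp(-avg_log_prob)

print(perplexity([0.9, 0.8, 0.95, 0.7]))   # ~1.2: the model is rarely surprised
print(perplexity([0.1, 0.05, 0.2, 0.08]))  # ~10.6: the model is mostly guessing
```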

A great YouTube video on the matter

Bye

Thank you for reading my article (or rant) ~ Artiosis.

- I plan to upload an emoji-free version of the article in the future ;)