AI has become the big buzzword in tech, but a lot of experts are starting to say the money side of it doesn't really add up. Companies are investing billions into systems that aren't actually profitable, and maybe never will be. Much of the so-called "growth" comes from creative accounting, like Microsoft treating $10 billion in server credits for OpenAI as an investment and then recording that same money as revenue once the credits are spent on its cloud. That's not new money; it's money moving in circles.
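To make the circular-money point concrete, here is a minimal sketch with made-up, simplified numbers. It is purely illustrative of the round-trip pattern the critics describe, not a model of Microsoft's actual books:

```python
# Purely illustrative: hypothetical, simplified numbers showing why
# vendor-financed spending can look like growth without new cash arriving.

investment_in_credits = 10_000_000_000   # cloud credits granted to the partner (hypothetical)
credits_spent_back = 10_000_000_000      # partner spends those credits on the investor's cloud

reported_revenue = credits_spent_back                        # shows up as revenue on the investor's side
net_new_cash = credits_spent_back - investment_in_credits    # what actually changed hands

print(f"Reported revenue: ${reported_revenue:,}")
print(f"Net new cash:     ${net_new_cash:,}")  # $0: the money just went in a circle
```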
Now states like Kansas are stepping in with laws to keep things in check. These new laws focus on privacy, data use, and banning certain foreign-owned AI platforms from state computers. It's not about killing innovation; it's about stopping misuse before it gets out of hand.
For example, Kansas banned AI software developed by foreign adversaries from being used by state agencies or universities. It also made it illegal to create or share AI-generated sexual or fake images of a person without their consent. Plus, the Kansas Supreme Court set up a committee to look into how AI can (and can't) be used in the court system.
This means AI companies operating or selling in Kansas need to start thinking harder about where their data comes from, who controls their tech, and what happens if things go wrong. You can't just build fast and hope for the best anymore. The law is catching up, and that's not a bad thing.
The hype might keep going for a bit, but when the bubble pops, the folks building real tools with real transparency will be the ones still standing.
As Kansas starts setting these limits, it's showing other states how to balance innovation with accountability. Developers and small business owners now have to think twice about which AI tools they use and where those tools come from.
For freelancers or agencies like MKS Web Design, it’s also a reminder that transparency matters more than ever. Pick tools that respect user data, are based in trusted regions, and can prove what they’re doing under the hood.
This kind of regulation actually helps long-term thinkers. Instead of chasing every new AI fad that pops up, it pays off to focus on tools that are proven, ethical, and stable. AI can be genuinely useful, but it has to be used responsibly or the hype will burn itself out.
The Kansas model shows that real progress happens when technology grows with guardrails. That's not a limitation; it's a foundation for sustainable work.
For web design and site management, knowing this context really helps. A ton of AI tools are now built into plugins, chatbots, analytics, and automation systems, and each one carries legal, security, and privacy weight, especially in states like Kansas. A simple audit of what your sites actually load is a practical first step, as in the sketch below.
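Here is a minimal sketch of that kind of audit: it lists the third-party scripts a page loads and flags ones whose URLs match vendors you want to review. The AI_VENDOR_HINTS list is a hypothetical placeholder, not an official or complete list, and this is a starting point for review, not a compliance tool:

```python
# Minimal sketch: inventory external scripts on a page and flag ones that
# match vendor keywords you have decided to review for data/privacy terms.
import re
from urllib.request import urlopen

# Hypothetical placeholder list; replace with the vendors your own review covers.
AI_VENDOR_HINTS = ["openai", "chatbot", "ai-widget", "ml-analytics"]

def audit_page(url: str) -> list[str]:
    """Return external script URLs on the page that match any vendor hint."""
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    scripts = re.findall(r'<script[^>]+src=["\']([^"\']+)["\']', html, flags=re.IGNORECASE)
    return [src for src in scripts
            if src.startswith("http") and any(hint in src.lower() for hint in AI_VENDOR_HINTS)]

if __name__ == "__main__":
    for src in audit_page("https://example.com"):
        print("Review this third-party script:", src)
```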
At the end of the day, design isn't just how a website looks; it's how it protects users. Staying up to date with Kansas's AI laws keeps you ahead of the curve and makes your work future-proof, not just fancy.