I'm not a programmer, so what you just said is all Greek to me, but I'll take your word for it that what you described represents a significant departure from the expectations the AI horny would lead one to have concerning the capabilities of the product. But AI's defenders can always respond that these are solvable problems, and that with the technology in a constant state of flux we can expect things to keep improving in the coming years, since it was only very recently that even that level of functionality became possible at all. My concerns with AI go beyond that, though, to problems that don't seem to be solvable in the short term and that have only gotten worse in recent years. These are more business-related than technology-related (though the limitations of the technology do factor in), and they threaten the entire viability of AI as an industry.
I use Photoshop quite a bit. During the pandemic, though, my graphics card crapped out, and since they were in short supply, I replaced it with an old one from 2014 I had lying around. Since I don't play games or anything, this was a perfectly acceptable solution, except that at some point newer versions of Photoshop started offloading some of the workload to the graphics card, for which mine was hilariously out of date. While the newer versions technically worked, there was a certain wonkiness that prevented me from adopting them full-time, and I continued using an install of Photoshop 2018, which was more than adequate for my purposes. In the meantime, I noticed that a newer version I had installed had incorporated "neural filters," a.k.a. AI, into the program (of course it had), and I fooled around with this a bit. Some functions were fun, if limited, while others, like upscaling and automatic scratch removal, didn't seem to do anything useful. But whatever. A few weeks ago I finally got a new graphics card after the old one gave up the ghost, and I looked into Photoshop 2026 to see what had changed since 2025. The answer was that the updates were basically all AI-driven, and not in a good way.
Adobe has been a convenient punching bag for the enshittification trend as of late, and the purpose of this post isn't to pile on, but to illustrate how it's representative of a greater rot in the software business and how AI only seems to accelerate that rot. Like previous iterations, some of these AI features are impressive and some are stupid, but all of them cost extra. The way it works is that you get a certain number of credits depending on your subscription (and as a long-time customer of the Photoshop-only plan I get a generous number of credits), and each time you use one of these features it costs a certain number of credits. And if you run out you can't just buy more; you have to upgrade your subscription, and I already get the most credits you can get with an out-of-the-box subscription that doesn't involve going through their sales department. To make matters worse, how many credits a given action will cost isn't based on a set rate but depends on 900 different factors, and is so complicated that the software can't even tell you how much an action will cost before it's run. And as a final blow, they don't even provide a way of telling you how many credits you have remaining; you eventually just get a message that you've run out.
The latter problem is obviously part of Adobe's slimy sales tactics: they want users to be unable to plan ahead so that they unexpectedly run out of credits in the middle of a time-sensitive project and are forced to upgrade, so I can chalk that up to normal corporate bullshit. The former problem is due to the fact that there is simply no way of predicting how much compute an AI system is going to use until it's already used it. The real kicker is that, due to the inherently unpredictable nature of generative AI, you don't even know whether the command is going to achieve the desired result, or how many attempts and tweaks it will take to get there; it may take multiple expensive generations just to produce something usable. The result is that the function is inherently self-defeating. There are lots of Photoshop functions that may require tweaking or not work at all, but they're integral parts of the software, they cost the user nothing but time if they don't get things right on the first try, and the individual user gets more proficient with experience. The AI features are simply a black box that requires you to throw an unknown amount of money at it and hope it does what you want. As a user, I'm thus disincentivized from bothering to learn these features, because my access to them is liable to be cut off at any moment, whereas my existing workflow works fine as it is.
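To make the cost problem concrete, here's a toy sketch of the dynamic. This is not Adobe's actual billing logic; every name and number below is invented purely for illustration. The point is simply that the compute a generative request consumes depends on what the model happens to produce and on how many retries the user needs, so the bill can only be tallied after the fact.

```python
import random

# Hypothetical illustration (not any vendor's real billing code): the provider's
# cost for a generative request depends on how much compute the model ends up
# using, which is only known after the request has run.

def run_generative_fill(prompt: str) -> dict:
    # Stand-ins for quantities that vary per request and can't be known up front:
    # number of model steps and the size of the region being generated.
    steps = random.randint(20, 80)            # the model may take more or fewer steps
    megapixels = random.choice([1, 4, 16])    # depends on the selected region
    compute_units = steps * megapixels        # only measurable after the fact
    return {"result": f"filled region for: {prompt}", "compute_units": compute_units}

def credits_charged(compute_units: int, units_per_credit: int = 100) -> int:
    # Billing has to be derived from the measured compute, i.e. after the run.
    return max(1, compute_units // units_per_credit)

total = 0
for attempt in range(3):  # the user may need several tries to get a usable result
    out = run_generative_fill("remove the lamppost")
    total += credits_charged(out["compute_units"])
print("credits spent before getting something usable:", total)
```

Run it a few times and the total swings wildly, which is exactly the experience of not knowing what a feature will cost until you've already paid for it.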
This is basically the problem with the whole "AI as a service" model these companies all seem to be banking on. If the response to Photoshop 2026 is any indication, customers want cost predictability and function predictability. If Microsoft Word cut you off after 1 million words per month, it would seem less like you were buying software and more like a free trial. It would be even worse if the number of words you were allowed to type depended on font, font size, formatting, etc., so that you didn't know how much each action would cost and were liable to be cut off in the middle of writing something important. Luckily, I can use Word to my heart's content without it costing Microsoft any extra, so they have no reason to impose such a restriction. With generative AI, on the other hand, every action costs the company money, whether it benefits the customer or not, and the company can't predict in advance how much money that's going to be. So there's no way an AI company can realistically charge based on use without pissing off their customer base, who will cancel after getting that first $75,000 bill in the mail and deciding that no, they aren't paying it.
Charging a flat monthly fee for unlimited usage doesn't solve this problem so much as stick the provider with the bill instead of the customer, so most of the AI services have resorted to a deceptive hybrid model: it looks like you're getting unlimited usage, but there's an asterisk stating that usage is subject to caps that are never explicitly defined. Some charge a monthly fee for access to a certain number of credits, which don't roll over at the end of the month. I'd find a lot to criticize about these models, which wouldn't fly in any normal business sales situation and would be relegated to the scummy end of the consumer pool in any other context, except that they still manage to lose money for the big players. Third-party agent developers may be profitable, but only because they're already buying their compute at a discount.
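Some back-of-the-envelope arithmetic, with entirely invented numbers, shows why flat "unlimited" pricing just moves the risk onto the provider rather than eliminating it:

```python
# Toy arithmetic with made-up figures: a flat fee vs. per-use provider costs.
monthly_fee = 30.00          # what the customer pays (hypothetical)
cost_per_generation = 0.10   # what each generation costs the provider (hypothetical)

breakeven = monthly_fee / cost_per_generation
print(f"break-even usage: {breakeven:.0f} generations/month")  # 300

for label, n in [("light user", 50), ("heavy user", 5000)]:
    margin = monthly_fee - n * cost_per_generation
    print(f"{label}: provider margin = ${margin:+.2f}/month")
# A handful of heavy users can erase the margin earned on many light ones,
# which is why "unlimited" plans end up hedged with undefined caps.
```

Whatever the real figures are, the structure is the same: revenue is fixed per user while costs scale with usage, so the asterisks are doing a lot of work.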
The only conclusion I can draw from all this is that software as a service, while loathed by customers, isn't really beneficial to companies either, other than as a cheap way of temporarily boosting numbers. And that's indicative of a deeper problem in the tech industry as a whole, a problem of its own making. From the 1980s through the 2000s, the computer industry grew exponentially. In the 1970s, computers were things that large corporations and government agencies used to manage large databases. In the 1980s they became productivity tools that every employee had on his desk. By the mid-90s, home adoption had started in earnest, and by the end of the decade practically everyone had one. In ten years the internet went from being a hyped curiosity to an essential utility. The technology was also changing quickly, and the improvements were massive. In 1994, a typical home PC had a 486 processor clocked at 66 MHz, 8 MB of RAM, and a 500 MB hard drive. It would run Windows 3.1, which would be replaced a year later with Windows 95, a huge upgrade. Five years later that computer would be hopelessly obsolete; in 1999 a comparable build would have a 450 MHz Pentium II, 128 MB of RAM, and a 13 GB hard drive. It would run Windows 98, which would be replaced two years later with Windows XP, an even bigger upgrade that eliminated the finickiness of DOS once and for all.
By 2010, CPUs would be clocked in the gigahertz and run multiple cores, RAM would be measured in gigabytes, and external hard drives of more than 1 TB would be affordable. Windows 7 had been released the year prior to great acclaim. To put all that in perspective, I'm currently writing this on a Lenovo ThinkPad from 2024 that has the same amount of RAM as the currently-available model, which has the same amount of RAM as my home PC build from 2019. Or 2018; I can't remember the year I last did a major upgrade, but I haven't done any since before the pandemic, aside from the aforementioned graphics card. I haven't needed to upgrade it either, as there hasn't been any decline in performance in the tasks I actually use it for. And even that upgrade didn't appreciably improve performance over the 2014 gear I was running before it. Windows 7 was the last Windows release that was universally loved; every one since then has been met with varying degrees of derision. There had been flops before: Vista was too far ahead of its time to be usable, and ME was a half-assed stopgap that never should have been released. The only mistake in that vein since then was 8, which completely misread the future of computing. Every new Windows since then has been an unexciting incremental upgrade that would probably have worked just as well as a security patch for 7.
I don't want to overstate my case here and suggest that computers haven't improved in the last 15 years; I'm sure my 2014 build would be woefully inadequate by today's standards. The point is that the advances aren't coming as fast as they did in years past, and when they do come the improvements are more subtle. It feels like 2010 was the year that computer technology reached a mature phase, where all adults, even your grandparents, knew how to use it, and good technology was about as cheap as it was going to get. This wasn't clear at the time, but within a few years it was apparent that things had stagnated. In the early 2010s I listened to TWiT semi-regularly, and it didn't seem like there was much to get excited about. The two big things the industry was pushing as the next frontier at the time were wearables and IoT devices. The former flopped spectacularly. The latter had better market penetration, though some of the implementations were ridiculous, and the whole concept has since become a metaphor for how technology has gone too far, trading simplicity and security for dubious functionality. As hardware stagnated, software quickly followed suit. Improvements in software follow improvements in hardware, and with hardware capability effectively unlimited for everyday purposes, there was nowhere left to go. Sure, there would always be new features, support for new devices, and better security, but game-changing upgrades seemed like a thing of the past.
So take a program like Photoshop, first released in 1990, which had improved by leaps and bounds by the time CS6 was released in 2012. A lot of users contend that this was peak Photoshop and that everything since then has been unnecessary bloat. I am not one of those people; the current software is significantly better. But CS6 was also the last version to be sold as a standalone product. Adobe had good reasons for ending standalone sales at the time: Photoshop was an incredibly expensive professional-grade product that also had broad-based appeal. This meant it was particularly susceptible to piracy, and lost more money to piracy than more modestly priced products did. Adobe had tried to combat this in the past by releasing less expensive consumer-grade versions like Elements, but these never really took off, as consumers felt like they were missing something (most notably, Elements did not provide access to Curves, which every photography book agreed was an essential tool). The decision to go subscription would give consumers access to an always-up-to-date full version of the product for less than it would cost to upgrade every other release.
The crowd that insists CS6 is better is dwindling now, but even in its heyday it was mostly composed of people who had never actually paid for Photoshop and were mad that it was now more difficult to pirate. But when Creative Cloud was first released in 2013, much of the criticism came from professionals and actual customers who were concerned about the new model. Sure, it was cheap now, but what was stopping Adobe from jacking up the price in the future? Creative professionals aren't exactly the most highly paid. In the past, you could upgrade whenever you could afford to and, if necessary, stick with a legacy version until things improved. But making your continued access to software you needed for your job dependent on paying a ransom you might not be able to afford was a different story. The reaction might have been better if CC had offered a significant upgrade over CS6, but rather than wait a few years and release a significantly improved version, CC came out earlier than one would expect and didn't offer much of an upgrade. Accordingly, the new subscription model was the only noteworthy thing about it. To Adobe's credit, the subscription price didn't change at all for over a decade, but in hindsight, there weren't any game-changing upgrades in that time, only incremental improvements. If the company had simply relied on customers paying full price to upgrade whenever they felt it was worth it, they might have been waiting a long time.
As SaaS has matured from those early days, it has become less about preventing piracy and more about the anxiety that newer products won't differentiate themselves enough from the old to justify the user upgrading. Better instead to lock in that revenue stream with a subscription that's impossible to cancel short of telling the bank to stop paying. Unfortunately, as a business move it's a one-time thing: the number goes up as all the old customers switch to subscriptions, but once they're aboard, the line flattens out again. In normal industries, this isn't a problem. In the computer industry, where 30 years of exponential growth had been not only welcomed but expected, the situation was unacceptable. Since there was nowhere left to go technologically, the industry had to resort to cheap gimmicks to keep the numbers up. SaaS was one. The aforementioned IoT was another; nothing looks better than announcing huge deals with appliance manufacturers who will be integrating your products. The problem with gimmicks like this is that, while they can increase revenue, they have a shelf life. A deal with Whirlpool to make a smart fridge may make both companies' numbers go up, but once there's a computer in every fridge sold, exponential growth is no longer possible. By the 2020s, the tech industry was running out of gimmicks. I think the reason Apple became the top dog during this period is that they were the only tech company that didn't seem to be peddling bullshit. I had a friend who was in and out of tech startups during this period (I even interviewed at one of his companies), and every idea was based on a free service that was really just scaffolding for advertising or data harvesting. A company like Apple that still sold products and services it expected customers to pay for was an outlier indeed.
So AI came to save the day. I'm not denying that the technology is impressive and potentially useful, but it is just about the biggest gimmick one could imagine. Simply being impressive and useful puts it in about the same league as, well, Photoshop, which, even in its first iteration, was a revelation to anyone who had ever worked in a darkroom. Unlike Photoshop, though, AI promises to solve not one particular problem, but all of the problems, including ones that haven't been identified yet. This latter point is particularly salient, because exponential growth in the tech sector was never based on the present, but on the future. If the tech industry of the 2010s looked like it was in danger of stagnating and becoming a normal industry, in the 2020s the sky was the limit. It was now worth it for capital to invest all of the money in AI companies, because if they were successful, then money wouldn't matter anyway.
And if they weren't successful? Well, they never considered that possibility, because the line only moves in one direction. The equation is pretty simple: if AI companies are successful, then your support was worth it and will be repaid; if they aren't successful, then you need to give them more money. But what happens when the money isn't there? How good Photoshop's AI features are is ultimately secondary to how much they cost. Someone has to pay for them, be it the customer or Adobe. Some companies may be willing to subsidize AI, but if Adobe were willing to give product away for free, they'd do better to dump CC and charge $500 for CS7, and we know that ain't going to happen. Instead, they've raised subscription prices by 50% in an attempt to get customers to pay for the privilege of having access to functionality they then have to pay extra to actually use. I doubt it's a coincidence that the first substantial price hike in the history of CC coincides with the introduction of the expensive AI upgrades. I doubt Adobe will suffer much for it, because their business (like Apple's) is actually sound and their products indispensable, but it's indicative of the perversion at the center of the tech world. Eventually, somebody is going to expect to get paid, and the party will be over. And as I write this, I don't see any scenario where the money is going to be there.