• 0 Posts
  • 26 Comments
Joined 1 year ago
Cake day: July 9th, 2023






  • For reasons that are not clear, any attempt to get a “corn dog” or “corndog” out of AI image generators gives bizarre results. The above was the closest I could get to a traditional corn dog, a frankfurter sausage on a stick, coated in cornbread. Most other results were clearly corn on the cob, or some other “thing on a stick” that was almost but not quite entirely not a corn dog.

    I have to conclude that, statistically, images on the web that reference “corn” and “dog” are more likely to contain corn on the cob than the classic state fair concession.









  • Stanford Encyclopedia of Philosophy:

    https://plato.stanford.edu/entries/neoliberalism/

    This entry explicates neoliberalism by examining the political concepts, principles, and policies shared by F. A. Hayek, Milton Friedman, and James Buchanan, all of whom play leading roles in the new historical research on neoliberalism, and all of whom wrote in political philosophy as well as political economy. Identifying common themes in their work provides an illuminating picture of neoliberalism as a coherent political doctrine.

    But several recent book-length treatments of neoliberalism (Burgin 2012; Biebricher 2018; Slobodian 2018; Whyte 2019) have helped give form to an arguably inchoate political concept. As Quinn Slobodian argues,

    in the last decade, extraordinary efforts have been made to historicize neoliberalism and its prescriptions for global governance, and to transform the “political swearword” or “anti-liberal slogan” into a subject of rigorous archival research. (2018: 3)

    Along similar lines, Thomas Biebricher (2018: 8–9) argues that neoliberalism no longer faces greater analytic hurdles than other political positions like conservatism or socialism.

    In light of this recent historical work, we are now in a position to understand neoliberalism as a distinctive political theory. Neoliberalism holds that a society’s political and economic institutions should be robustly liberal and capitalist, but supplemented by a constitutionally limited democracy and a modest welfare state. Neoliberals endorse liberal rights and the free-market economy to protect freedom and promote economic prosperity. Neoliberals are broadly democratic, but stress the limitations of democracy as much as its necessity. And while neoliberals typically think government should provide social insurance and public goods, they are skeptical of the regulatory state, extensive government spending, and government-led countercyclical policy. Thus, neoliberalism is no mere economic doctrine.

    … etc …



  • I think that in the minds of Friedman, Hayek, Mises et al. (who coined the term neoliberal after WW2), it was meant to marry modern pro-market economic ideas (the “neo” part) with classically liberal social ideals reaching back to the Enlightenment. I think they intended it as a counter to socialism, which paired anti-market economics with regressive positions on social and civil liberty (at least in practical application in the wake of WW2).

    But yes, in modern parlance it is often a slur aimed at pro-corporate capitalist kleptocracy.





  • In the real world, artists pay their way by doing commercial work, or holding down a day job as a graphic designer, etc. Actors do commercials and Hallmark specials while looking for their break into serious theater. Writers put in hours writing ad copy or translating or speechwriting while trying to sell the Great American Novel. You call it poison, but ultimately it puts food on the table for artists and their families.

    These roles can ONLY be displaced if AI is allowed to steal everyone’s work, and flood all available channels with mediocre AI paraphrases and transcriptions of that work. That’s the decision point we’re facing right now – do we stand idly by and allow big tech to replace workers by copying the fruits of human labor without compensation?

    We can debate whether AI output is “good enough” for various use cases. And for some of those use cases, you’ll be absolutely right that AI will never produce a convincing product. But that’s not the issue. The issue is whether it’s right for companies to steal the work of humans to use as training inputs, and flood the market with that mediocre output. AI producing sh*tty output doesn’t make it morally acceptable to steal, and to profit from the stealing.