

2012.01.24 06:42 ElBeh /r/moviescirclejerk

"we live in a society" - Friedrich Nietzsche


2011.08.28 00:51 xtc46 A subreddit for general treadmill enthusiasm

A subreddit for general weight training discussion, focused on intermediate level and above in experience and strength: for strength sport competitors, athletes in sports that benefit from weight training, and weight training enthusiasts. Or for people to tell WeaponizedSleep to eat more.


2014.11.18 15:13 LGHyourinmaru Final Fantasy Brave Exvius

Final Fantasy Brave Exvius is a free-to-play role-playing game developed by Alim and published by Square Enix for iOS and Android devices. ::::: Join us on Discord ::::: https://discord.gg/ffbraveexvius


2024.06.01 15:23 InternationalRate176 idk what to do /vent

im new to this sub after years of finding myself here whenever i search in vain for answers as to why my mom is Like That™️.
On a certain level, i don’t think i have an Nmom. On a million other levels, i do. but i think a small part of me wants to believe she doesn’t act the way she does out of narcissism—she just “lacks self-awareness” or “is selfish” or “does things considering herself first and foremost.”
she’s never been verbally or physically abusive, but she was extremely emotionally neglectful growing up, putting me in daycare from 3 months until i was about 11-12 (i aged out of the daycare and that’s when she finally decided to start “trusting me at home alone”.) She always took control over raising me and my brother, seemingly so she could make us into carbon-copy good little christian girls (didn’t work). and now, as an adult, i can see she’s always had bad work-life balance and doesnt see a problem in it.
now that im an adult and back from time living abroad, she’s trying to help me find a job but seems to think she has a total understanding of the work landscape despite not having actually applied for a job in over a decade. With encouragement from my therapist, i finally stood up to her and told her to stop helping me since her help never amounts to anything, and she essentially told me no because “everyone needs help.”
this turned into a vent, and i could go on further but my phone is dying and i wanna share this so it doesnt all get deleted lol.
submitted by InternationalRate176 to narcissisticparents


2024.06.01 15:16 Ready_Amoeba5401 Is this laptop okay enough for its price?

submitted by Ready_Amoeba5401 to Tech_Philippines


2024.06.01 15:11 ADavis0232 I need some advice on setting up a GB team.

I've got a decent spread of champs and I've broken into GR1-19, though that's the only one so far. I usually put up about 12k blood on an NM3 run and I'd love to finally move up to NM4.
submitted by ADavis0232 to WatcherofRealmsGame


2024.06.01 15:09 Fuzzyhoof26 Duodenal atresia - successful outcome

Hi Everyone,
First time posting as I wanted to share our experience of duodenal atresia and found this sub incredibly helpful when I was preparing for our little one to undergo surgery and be in the NICU. As duodenal atresia is fairly rare, I wanted to share our experience as the few stories I did read were incredibly informative.
For context, I was 32 weeks pregnant when, at our third trimester scan, a ‘double bubble’ was spotted on the ultrasound. My OBGYN thought it was most likely duodenal atresia and spoke to me and my husband about further genetic testing, as the condition is linked to Down syndrome. He also advised that our baby would need an operation soon after birth and that I would likely develop a condition called polyhydramnios, which would make early labour a possibility. All of this information was shocking and sudden as I had had a fairly uneventful pregnancy up to this point.
My amniocentesis came back negative for Down syndrome; however, the double bubble persisted on subsequent scans, so we prepared for a duodenal atresia diagnosis at birth.
I had a c-section at 37+4 (due in part to polyhydramnios) and our son was born weighing 8lbs 2oz. He was immediately taken to the NICU and we were able to go and see him later that day. Seeing him needing oxygen and with wires was something I had not fully prepared for but the NICU nurses were incredibly supportive and talked us through our son’s care in the lead up to his operation.
Our son had his operation on day three - the surgeon said his duodenum was larger than average post surgery - so we would have to wait to see how quickly he would progress. This was the most challenging part as until our son’s digestive system showed signs of working, he would not be able to come home. We were assured that this would take time but it didn’t make the wait any easier.
Our son began breast milk feeds on day four post surgery. He began on 3ml every three hours with the intent to gradually increase. His aspirate levels were checked at feeding times and he had a TPN line to ensure all his nutritional needs were met. This was a slow process and we needed lots of patience whilst celebrating the small victories of his feeds increasing and his aspirate reducing.
Ultimately, it was not until day eleven that he had a motion. However, this proved to be the turning point in his recovery and his progress rapidly improved. His aspirate began decreasing and within five days was almost at 0. His surgeon and paediatrician increased his milk levels in the morning and evening. Twenty one days after he was born and eighteen days after surgery, our son was consistently being fed 50ml eight times a day, having regular motions and no longer needed TPN or additional fluids. Finally it was time for him to come home.
For any parents facing a diagnosis of duodenal atresia, having your baby have major abdominal surgery so soon after birth is an incredibly challenging experience. I was assured by the high success rate of the surgery and valued the posts I found from other parents who had been through the experience. Happy to answer any questions from parents facing something similar.
Our baby is now four weeks old and thriving at home!
submitted by Fuzzyhoof26 to NICUParents


2024.06.01 15:06 No_Cherry6771 My first ever heart kill

I did it by inflicting 999 poison within 5 turns. There's an odd sense of satisfaction in having poisoned the heart to death.
submitted by No_Cherry6771 to slaythespire


2024.06.01 15:04 curium99 Building a partition wall for plaster finish

Hi, I am building a partition wall in a 1934 property between 2 bedrooms to create a built-in wardrobe for each bedroom.
There is a brick pillar at each end that has been finished with dot-and-dab plasterboard and skim, or plaster direct to the brick. The pillar is 165mm face to face.
My partition wall is being constructed parallel to these pillars. I'm using metal c-studs to construct the partition. I'll then screw plasterboard to the studs and get a plasterer to skim finish.
I want to know the maximum gap I can leave between each face of the partition and the corresponding face of the pillar that can still be made up by the plasterer to give a level finish between the pillar and the partition wall. The metal c-stud track is 72mm wide and I'll screw 12.5mm sound bloc plasterboard to each side of the partition. So before plastering, the partition will be 97mm thick, leaving a 68mm difference.
I thought I could double-board one side of the partition, which would increase the thickness to 109.5mm. That still leaves a 55.5mm difference; split between each side, that's roughly 28mm to be made up in skim on each side. Should I double-board each side? I'm looking for advice from plasterers on how to construct this partition so it is easiest for the plasterer to get a good finish.
I should say that one side of the partition will be the inside of the other bedroom's wardrobe, so potentially I could sacrifice the finish on that side, but I'm aiming to get the best finish on each side.
Please advise on how you'd proceed.
Many thanks
EDIT: Perhaps I should ditch the metal c-studs and buy some 130mm-thick timber?
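For anyone checking the figures, the make-up arithmetic above can be sketched in a few lines of Python (a sanity check only, using the dimensions quoted in the post; `gap_per_side` is just a helper name for this sketch):

```python
# Sanity check of the partition thickness arithmetic from the post above.
PILLAR = 165.0  # mm, pillar face to face
TRACK = 72.0    # mm, metal c-stud track width
BOARD = 12.5    # mm, sound bloc plasterboard, per layer

def gap_per_side(boards_side_a: int, boards_side_b: int) -> float:
    """Skim depth needed on each side for the partition to match the pillar faces."""
    partition = TRACK + BOARD * (boards_side_a + boards_side_b)
    return (PILLAR - partition) / 2

print(gap_per_side(1, 1))  # single-boarded both sides: 34.0 mm per side
print(gap_per_side(2, 1))  # double-boarded one side: 27.75 mm per side
print(gap_per_side(2, 2))  # double-boarded both sides: 21.5 mm per side

# The EDIT's 130mm timber option, single-boarded each side:
print((PILLAR - (130.0 + 2 * BOARD)) / 2)  # 5.0 mm per side
```

On these numbers, even double-boarding both sides still leaves over 20mm to make up per side, whereas 130mm timber studs would bring the difference down to roughly 5mm per side.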
submitted by curium99 to Plastering


2024.06.01 15:01 FaithlessnessKey1726 Career dilemma—teaching or library?

(Skip to the end to see the informal poll and avoid the anxious ramble)
My first year of teaching was a disaster from beginning to end. I know most teachers’ first year is the worst and you feel like you don’t know what you’re doing bc you don’t know what you’re doing and there’s so much pressure. Etc.
Even beyond the more typical misery was a lot of personal life tumult and turmoil and trauma and chaos going on, including a debilitating (somewhat unofficial but more or less confirmed based on symptoms) diagnosis I have to live with now without having much insight as to prognosis. And a lot more discomfort involving loved ones.
Reflecting on this year is almost as traumatic as the experience itself. I had next to zero support, with the exception of about 2 weeks under the guidance of an amazing master teacher. But that was it. The morale at the school was beneath rock bottom. Every single day was worse than the day before. I tried to go in positive. But with very few exceptions, everyone was miserable and no one tried to hide it. People were directly rude to me, condescending, sarcastic, openly contemptuous, angry; they hated the kids and cursed about them and screamed at them (“shut UP!!!!” “MORON! GET OUT!” “You’re STUPID, I should have LET that student hit you!” “I woulda hit you in the face too if you’d done something like that to me!” Just a few quotes off the top of my head, not to mention one slamming the door in my sped teacher’s face along with our sped students, which the principal did absolutely nothing about despite his friendship with the sped teacher). Discipline/behavior was an absolute JOKE. I think I’ve painted an accurate picture of how awful it was.
I guessed my way through everything but did my absolute best and figured everything out. A bit of productive struggle and hey, by the end of the year I was an expert in a lot of things I knew nothing about months earlier. My rapport with my students was great, to give myself some credit. They loved me. Albeit too much—they thought of my softness as a doormat. They felt free and liberated in my classroom bc I seldom raised my voice. Unfortunately what they’re accustomed to is only listening when yelled at, and as a new teacher, I did not have better tools to manage classroom behavior, beyond building relationships, and my class was a bit out of control. It became all about getting through the curriculum through the 3rd quarter.
My benchmark scores went up, which was pretty amazing considering everything. However, at the very beginning of the 4th quarter my principal informed me that he wasn’t renewing my contract and that he would never let me teach 4th grade again, that “I don’t know if I would ever let you teach any grade level, maybe try pre-k—you get nap time and someone is always with you.” So he wrote off my career as an elementary teacher after just a few months of teaching. I could go on about how he had covertly brought in his very own former student (who had only recently begun prepping to take the Praxis) as my replacement, unofficially “employed” but “technically not.” But I don’t want to get into that, as furious as it made me. I just stopped writing lesson plans bc no way was I gonna train her for free when they gave me zero support through the year.
I had way more bad days than good—the kids and my para got me through it! I was grateful for that. They were wonderful and I miss them. But I was made to feel incompetent. I slowly started to realize that him booting me was a blessing in disguise, especially after learning how many students I’d have had next year. And some other changes that won’t be helpful.
There’s also a lot of BS going on in our state regarding education. So things are not exactly going to get easier. Alas, I need a paycheck and I went to school and passed the Praxis to be a teacher. I’m 44 so it’s not like I have many options.
But I did actually finally get an interview at a library last week! I’d applied for 6 years and never got so much as a phone call. Unfortunately it’s part time and drastically less pay (which is honestly pitiful). And it would take me years to make close to what I make now. And I was just getting into certification so as a teacher I’d get a $10k raise. Buuut I really don’t want to miss a rare opportunity to get my foot in the door at the library!
I’ve got dozens of job offers in my district. I had 6 principals call me and email me yesterday alone! I know I could make decent money. But I don’t want to turn down the library job, which absolutely would not cut it financially.
I forgot to mention a key component of this dilemma: Teaching is extremely overstimulating to me. I’m autistic/adhd. This was part of my misery. Between the loud a/c units in my classroom, the noise of desks constantly clanking, kids constantly talking over me, etc etc etc, and the awful attitudes of most coworkers and all the other stuff, I barely made it to the end of the year. I know most of us actually feel that way, but my day to day in the classroom is beyond awful. I cried constantly, I had panic attacks going in every single morning during the 4th quarter after years of reduced panic attacks, most days I felt frustrated, and some days I even had moments where I could not talk anymore and had to go home (those were days when at least one kid told me to “Shut the f- - k up b-tch!” or fought or both, plus admin treated me like crap and I had enough). Not to mention spending entire weekends and weeknights writing lesson plans, creating lessons, grading, entering grades, etc etc etc. All I could think about every day was how much I wished I could get a library job!! I even had a student tell me I would make a better librarian than teacher. She was excited when I got the call about the interview lol.
But what if my next school is better? What if I go in knowing expectations and having a better idea of how to do things and how to establish classroom procedures, what if it’s better? What if it’s stupid to give up on a better paying job? I’d love to get my MLIS but realistically, there aren’t very many librarian jobs and moving isn’t an option. The day to day would almost be worth the paycut. I’ve contemplated doing both, just for one year. I know that sounds nuts, and it’s risky, but what’s more important? My paycheck, or my mental health?! I honestly don’t know! I need the money. But I also need a peaceful environment.
Here are my options:
A) Substitute w library for almost the same money as I made uncertified, which was barely enough
B) Library + teaching full time bc you’re insane and unrealistic
C) Library only + MLIS bc it’s your dream & short term paycut is worth long term happiness.
D) Girl, are you insane?! Teaching only bc it’s the smart move!
submitted by FaithlessnessKey1726 to teaching


2024.06.01 14:58 vozjaevdanil I can retrain him as Segundo VOL surely? Don't see any problems with his stats

submitted by vozjaevdanil to footballmanagergames


2024.06.01 14:57 KeeganTroye [Live Text][5e][LGBTQ+ DM][Online] Her Dark Reflection: A Ravenloft Horror Campaign

Once upon a place outside of time…
A goddess prepares for a fate she cannot comprehend– death is coming, dark mists envelop her existence, her divinity dims, and she will cast her magic for the last time. She does not choose her champions nor lay the groundwork for a grand plan; there is no time. She twists the strings of fate into a noose and throws it out into the dreaming places with only hope to guide it…
…you are asleep, or in your approximation of such, when the threads find you– tying your weave into a struggle against a shattered queen in a distant land. It is a moment in your life that is soon swallowed by reality and forgotten. But what is that drumbeat that plays in the back of your mind, and the sense of dread it brings, that something is coming for you, and getting closer? And when you're finally sure it will hold back no longer-- another dream, a summons, the whispering of a witch who promises you answers if you would only step through the looking glass...
The Gritty Details.
New Campaign
Time: To be discussed and narrowed down as players join.
Session Length: 4~ hours [+ with group agreement.]
Starting Level: 2
Party Size: 5
System: 5th Edition Dungeons & Dragons
Medium: Discord (and later Foundry)
Who Am I?
Hi, my name is Keegan (they/them), a long-term tabletop roleplayer with nearly a decade of experience. I got into tabletop with Pathfinder, migrated to D&D 5th, and have had a lot of experience with the narrative game space, primarily with Powered by the Apocalypse games. Something about 5E always brings me back, for the most part to Ravenloft, where I've run five campaigns.
I live in the space of text-games, which I love as an amateur writer and due to the inclusive nature of the medium.
About the Game.
We’ll be running a homebrew Domain of Dread within the frigid lands of Kolm, where the geography and mythology of the Nordic nations meet dark fairy tales with a gothic horror spin. You will find yourself fighting for your lives against the terrible Shattered Queen. Clockwork soldiers, a city of bells, and the dreaded crying knight are just some of the challenges to be overcome as you battle to escape the mists of Ravenloft!
Her Dark Reflection is a horror campaign and I lean heavily into the theming. There will be levity in the campaign and table banter is a thing, but I will always keep pushing the envelope as far as maintaining a dark, story-driven campaign. This does mean that the campaign is 18+.
This is a Level 2 to Level 12~ campaign - there is a lot of content and it might scale further with player approval.
I consider myself a primarily story driven DM, I tend to alternate combat and roleplay and would describe the game as 70/30 Roleplay/Combat.
I’d consider campaign inspirations as:
Folktales: Vasilisa, Baba Yaga, The Ice Queen.
Novels: The Hunchback of Notre-Dame, Snow White, Through the Woods: Stories, Cursed: An Anthology
Games: Amnesia Series, Alice: Madness Returns, Dark Souls
Characters!
The game will make space for your backstories right from the get-go; something I love doing is tying characters into the story they're playing and giving unique arcs to each player.
For character options, I allow ALL published material, ALL up to date Unearthed Arcana, and a selection of approved homebrew. Other homebrew is allowed if run by me first, and I will adjust for balance.
If you present an evil PC I expect you’ll be able to play with the party sufficiently to complete the campaign.
The Players.
I'm looking for regular sessions and the campaign will be either once a week or twice a month at worst. The main thing I want is reliable attendance!
This game is LGBTQ+ and BIPOC friendly and will be inclusive in the worldbuilding, triggers will be discussed and safety tools implemented to protect players. I want my games to be safe.
The System.
The game will initially be run through Discord, until I have set myself up (having recently moved) and gotten myself a PC, which should be soon; then it will migrate to Foundry. Because of this, you will need a PC or laptop that is new-ish (anything from the last decade should be fine).
An Application Form!
You can fill this in and post here, DM me, use Reddit chat, or add me on Discord, handle: keegantroye
You (Sell yourself as a player here!)
  • Handle(And pronoun):
  • Anything You'd Like To Share About Yourself:
  • Gaming Experience:
  • What Kind Of Games Do You Like?
Character (if you don't have an idea, use a previous character of yours as an example of a character you've created)
  • Handle(And pronoun):
  • Class Role Concept:
  • Backstory Elevator Pitch:
submitted by KeeganTroye to lfg


2024.06.01 14:56 Technical_Lab_747 Thoughts on bloods

Can someone break down my bloods? I dunno what it all means.
submitted by Technical_Lab_747 to trt


2024.06.01 14:52 glassfeathers Anyone feel like critiquing my Resume?

I have been applying to consultant roles like crazy and I'm not making any headway. Can anyone think of ways I can improve what I have below or what other career avenues I should pursue?
EXPERIENCE
Homecare Homebase, LLC, Dallas, TX | Implementation Training Consultant, Jan 2022 - Present
Provide quality implementations and ensure the adoption of products within health agencies across the country.
Guide customers and internal training teams through the deployment of the software for multi-site locations across the United States.
Lead clients through training of the software and support clients throughout onsite and remote deployment.
Assist in making customizations to clinical content (i.e., visit types, pathways, and assessments) to ensure the environment is ready for testing and training.
Collaborate with internal staff on training preparations for assigned projects, including completing preparation of the training database prior to customer deployment.
Communicate effectively and understand the customer's needs, goals, and strategies, and relay those needs to the internal HCHB project team for action and/or resolution.
Independently research customer inquiries and determine the sources of issues.
Independently research data functionality and report problems to the customer support team with the steps needed for customer resolution.
Maintain product knowledge as new enhancements and functionality are released in the application.
UNIVERSITY OF NORTH TEXAS, DIVISION OF ADVANCEMENT, Denton, Texas | Student Assistant, May 2019 - Aug 2020
Operate office equipment such as fax machines, copiers, and phone systems, and use computers for spreadsheets, word processing, database management, and other applications.
Answer telephones and give information to callers, take messages, or transfer calls to appropriate individuals.
Set up and maintain paper and electronic filing systems for records, correspondence, and other material.
Maintain scheduling and event calendars.
Operate electronic mail systems and coordinate the flow of information, internally or with other organizations.
CORELOGIC, Irving, TX | Operations Specialist, Jul 2014 - Jul 2018
Conduct research of data, perform factual analyses of loan histories to assess appropriate resolution, and compile all corresponding supporting documentation for review.
Leverage all available resources such as existing databases, third-party sources, and/or public information on the internet.
Utilize tools to analyze, query, and manipulate data according to defined business procedures.
Extract and enter appropriate data onto applications, other forms, or databases.
Process payments that may require preparing, balancing, and submitting complex wire disbursements for mass manual payments.
Proofread documents for grammatical, mathematical, typographical, and composition errors, adhering to standards and guidelines.
Perform root cause analyses of all discovered errors in order to identify areas requiring development or enhancement of training, policy, or procedures.
Adhere to all required company- and client-driven Service Level Agreements.
Perform duties such as maintaining all levels of files, searching and investigating information contained in files, processing departmental documents requiring more specific knowledge of functional operations, and entering report results into tracking systems.
Track progress and provide status reports to management on achievement of daily and weekly team goals.
Act as Subject Matter Expert on internal and client projects, making recommendations to these processes.
Develop and maintain business relationships with individuals representing other departments and/or outside organizations.
Maintain detailed and comprehensive records for all complaints, including error findings, during the complaint investigation process, and document corrective actions taken.
EDUCATION
University of North Texas Denton, TX
Bachelor of Arts (B.A.) History (Apr 2021) GPA: 3.2
University of North Texas Denton, TX
Bachelor of Arts (B.A.) Digital and Print Journalism (Apr 2021) GPA: 3.2
Relevant Coursework: Strategic Social Media, Principles of Public Relations, Applied Design for Advertising and Public Relations.
ADDITIONAL SKILLS: Microsoft Office, MSP, Fiserve, Black Knight, HTML, Search Engine Optimization, Adobe Suite, Oracle ERP, BlackLine
submitted by glassfeathers to careerguidance


2024.06.01 14:45 KpopRates The "2021 in K-Pop" Rate, Day 1: Happy Robbed Day! Happy Happy Worst Day!

Welcome, everyone, to Day 1 of the 2021 in K-Pop rate! We are traveling back in time to a monumental year, both for K-Pop and for the entire world. We will be revealing the bottom 40 songs today.
Day 1 will begin at 9 AM Pacific, about 3 hours from when this post goes up. For those following the song rate live during this time, we highly encourage you to follow both this post and the live chat on our community Discord: https://discord.gg/FquKMgz9EU, as we will be using both for chatting/communication.
If you are here after the live results-reveal watch session, we would encourage you to avoid spoilers by going through the comments section one-by-one, as the comments will be sorted by Old after the live watch session so you can simulate the ranking reveal for yourself.

STATS

Participants: 64 participants
Average score: 7.013
Average controversy score: 2.090 (Any song with a controversy score higher than this is considered to be divisive)

SONG LIST:


January:

(G)I-DLE - Hwaa
Epik High - Rosario
Cherry Bullet - Love So Sweet
Dreamcatcher - Odd Eye
IU - Celebrity

February:

SHINee - Don't Call Me
Sunmi - Tail
ONF - Beautiful Beautiful

March:

Rosé - On The Ground
Pentagon - Do or Not
WOODZ - Feel Like
Weeekly - After School
IU - Lilac
Baekhyun - Bambi
WJSN - Unnatural

April:

Hoshi - Spider
STAYC - ASAP
SHINee - Atlantis
Enhypen - Drunk-Dazed
ITZY - In The Morning

May:

NCT Dream - Hot Sauce
Oh My Girl - Dun Dun Dance
WJSN The Black - Easy
Yuqi - Bonnie & Clyde
aespa - Next Level
fromis_9 - We Go
Taemin - Advice
Enhypen - Fever
BTS - Butter
Everglow - First
TXT - Lovesong

June:

Monsta X - Gambler
Lightsum - Vanilla
ONEWE - Rain To Be
LOONA - PTT (Paint The Town)
NCT Dream - Hello Future

July:

Taeyeon - Weekend
AKMU (with IU) - Nakka
Dreamcatcher - BEcause

August:

Jeon Somi - Dumb Dumb
TXT - Loser=Lover
Brave Girls - After We Ride
Stray Kids - Thunderous

September:

STAYC - Stereotype
Purple Kiss - Zombie
NCT 127 - Sticker
Lisa - Money
ITZY - Loco
KEY - Bad Love

October:

Twice - The Feels
aespa - Savage
Jo Yuri - Glassy
TRI.BE - Would You Run
Seventeen - Rock With You

November:

The Boyz - Maverick
Billlie - Ring X Ring
Chungha - Killing Me

December:

IVE - Eleven
Xdinary Heroes - Happy Death Day
ATEEZ - The Real

Bonus Rate:

Treasure - Beautiful
The Boyz - Kingdom Come
Joy - Hello
BDC - Moonlight
Golden Child - Ddara
Weki Meki - Siesta
Everglow - Pirate
submitted by KpopRates to kpoprates


2024.06.01 14:38 Crimson_SS9321 Bhagwa Atheism

What is Bhagwa Atheism ?

A hypocritical and privileged form of atheism that is critical of every religion (particularly Islam), but harbours sympathy for the religion its adherents were born into, i.e. Hinduism.

Why this sympathy?

The answer is simple: casteism. We shall discuss later how this is deeply rooted in caste-based privileges that facilitate such thinking. To understand this, let me explain from my point of view:
According to Marxism, in a feudalist/semi-feudalist/capitalist society there are mostly two classes: the bourgeoisie (the ruling class, which controls the means of production and generates profit from the exploitation of the working class's labour power) and the workers (or proletariat), who own no means of production, have no power to purchase the labour power of others, and survive solely by selling their own labour power.
These classes emerge only at a certain stage in the development of the productive forces and the social division of labour, when there exists a social surplus of production, which makes it possible for one class to benefit by the expropriation of another. The conflict between classes begins there, founded in the division of the social surplus, and constitutes the fundamental antagonism in all class societies.
This holds true in the case of several capitalist nations such as the US, Canada, South Korea, etc. In the case of India, however, this classification becomes complicated by yet another form of classification, which has effectively nullified any form of possible class conflict: casteism.

What is Casteism in Hinduism ?

Hindu casteism (on a material basis) is a form of hereditary class based on a hierarchical order, which ensures the flow of capital to the topmost caste order and its control over the modes of production with the help of religious decrees.
It is a classification which sets up 'permanent classes' based on their 'assigned' modes of production and their position within society's superstructures, according to a hierarchy set by the 'permanent' ruling classes.
Once you take birth in one of these castes (hereditary classes), you are automatically assigned to the forces of production according to the 'order', which you will serve for the rest of your life.
Unlike class, casteism severely restricted the upward mobility of people belonging to the lower strata of the caste order, the exception being a few handpicked intermediate 'gatekeeping' castes loyal to the system, who were sometimes rewarded with promotion into the ruling classes themselves.

So what changed this ?

Before the arrival of British colonisation, caste and class were very much indistinguishable. Apart from the ruling class there were handpicked bourgeois intermediate classes (vaishya) who were in direct service to the upper castes (sometimes the roles were reversed). The proletarian classes (shudras and pariahs) were much the same as their European counterparts, lacking capital and hence any control over their means of production; in addition, they faced inhumane discrimination because of the caste into which they were born, and were looked down upon as inferior subhumans.
This discriminatory system ensured the poverty of the lower castes and the prosperity of the top order, despite the lower castes comprising 90% of the population, and it guaranteed cheap labour and the exploitation of the working-class 'castes'.
However, with the introduction of colonial capitalism by the Britishers, the modes of production of the old feudal order 'fused' with it to become a semi-industrial, semi-feudal economy. This is when, for the first time, European-style working classes began to take shape in India, parallel to the hereditary classes of casteism. It also introduced new bourgeois classes within every community irrespective of caste, but that is a topic for another discussion.
Irrespective of these changes, the older hereditary classes still shaped the new order: those belonging to the upper castes still had access to privileged jobs, better positions, and capital, and hence were in a comparatively better and more prosperous position than those belonging to the lower strata of the older caste-based order. The introduction of the 'English Education Act 1835' and later the 'Macaulay Committee 1854' by the Britishers further cemented the caste order within the new colonial administration, ensuring the hierarchy of the upper castes and the exploitation of the oppressed castes.

Privileged classes and Caste blind class struggle

In the words of Thomas Babington Macaulay:
"We (Britishers) should try to create a class of people, who would work as translators between the people who we are ruling and us, even though they may look like Indians by color; but their likes and dislikes, morals and thinking will be like an Englishman"
As we can see, the British were fine with casteism as long as it ensured their hold on, and smooth management of, their empire. They did help abolish gut-wrenching misogynistic practices such as 'Sati', but made almost zero effort to abolish the caste system and untouchability (as doing so would have meant losing the subordination of the intermediate 'gatekeeping' classes that were in service to the British crown).
These new intermediate urban classes enjoyed several rights within the colonial administration that ordinary classes did not. However, as mentioned earlier, with the introduction of colonial capitalism the majority of castes further split into binary classes: economically forward and economically backward (note: this was only the case for the upper and intermediate castes; the majority of oppressed castes remained in poverty). The economically forward classes were in fact the privileged classes, while the economically backward classes began to fall into a borderline category between the upper and intermediate castes and the lower castes.
And this is where 'caste-blind' atheism, secularism, and socialism originate: from those who saw capitalism and colonialism as cruel systems of occupation with respect to their own material conditions, but failed to recognise, let alone emancipate, those facing inhumane treatment below their class.
But does this mean they were barred from upward mobility within colonial capitalism, as the oppressed communities were?
No. They still had access to capitalist progress and upward mobility within society. Simply put:
Casteism filtered oppressed castes out of upward mobility within the new capitalist order, while capitalism ensured that progress remained limited to upper-caste-friendly circles.
So whenever a member of the economically backward classes transitioned into the economically forward classes, they carried their former progressive social ideas into their newfound social position while simultaneously giving up their resistance to capitalism. This new default ideology became known as liberalism.
The British colonial administration in India had many parallels with the apartheid system, but that is a debate for another topic.

Consequences of Caste blindness in other '–isms'

With important positions still accessible to powerful 'caste-blind' and 'casteist' members of society, they began to clash in a struggle for power. But they all had one thing in common: none of them gave up their caste identity.
Around independence, this power struggle materialised in two broad ideologies: 'caste-blind' leftism and 'meritocracy'-based liberalism/conservatism. But none of them were ever seriously at each other's throats, as they were getting equal opportunities for their personal betterment.
But what did the oppressed-caste proletariat gain from this? Nothing.
They only got a chance at betterment when Dr. Ambedkar, Jyotiba Phule, Savitribai Phule, and others began making efforts at their upliftment. However, these efforts lacked an anti-capitalist approach, emphasising caste-based emancipation over capitalist oppression, which later proved detrimental to their cause.

Modern day India and caste based wealth inequality

After the partial implementation of the Mandal Commission and the Naxal uprising, the harmonious 'clash' between conservative and progressive left-liberal dynamics began to be perturbed, as these events injected more and more political consciousness into the oppressed communities regarding reservations and political participation.
This is where the concept of savarna meritocracy comes forth.

Illusion of Meritocracy

Meritocracy, the belief that rewards follow merit and merit alone, is in reality a capitalist hypothesis: a fallacy that fuels the illusion that we live in a merit-based system.
Meritocracies tend to stratify over time. Successful people will pass on their wealth and privileges to their children and can perpetuate a widening inequality of opportunities. It can lead to the misplaced belief that only their talents and hard work account for their success, neglecting the support they have received. — Chan Chun Sing, Min. of Education, Singapore
This meritocratic illusion imposed biased and opaque predetermined conditions on underprivileged candidates; failing to meet the criteria within a fixed amount of time meant the ruin of a candidate's dreams. In other words, it was an underhanded tactic to filter out 'representation-based' or otherwise unfavoured candidates and simply stereotype them as 'unworthy', disregarding the unequal support competing candidates received due to their material conditions, pretending to 'reward the best of the best', and overlooking the possibility of discrimination faced by oppressed-caste students at the hands of upper-caste-dominated faculties.
After 1960 more and more of the population of the United States spent more than one-fourth of their entire lifetime in schools, from ages two to twenty-two. As on so many other levels and in so many other ways of mass democracy, inflation had set in, diminishing drastically the content and the quality of learning: more and more young people, after twenty years in schools, could not read or write without difficulty. Schools are overcrowded, including colleges and universities. In this increasingly bureaucratized world little more than the possession of various diplomas mattered. Since admission to certain schools (rather than the consequently almost automatic acquisition of degrees) depended on increasingly competitive examinations, the word "meritocracy" was coined, meaning that the rising positions to be acquired in society depended on the category of the degree and on the category of the college or university wherefrom one graduated. In reality the term "meritocracy" was misleading. As in so many of these spheres of life, the rules that govern the practices and functions of schools and universities were bureaucratic rather than meritocratic. It is bureaucracy, not meritocracy, that categorizes the employment of people by their academic degrees. The number and the variation of degrees awarded by higher institutions grew to a fantastic, and nonsensical, extent. Besides being custodial, the purpose of institutional education was now the granting of degrees to provide instant employment. – John Lukacs (At the End of an Age)

Final Conclusion

Given all these factors, we can conclude that Bhagwa Atheism is a form of Hindu upper-caste exceptionalism (agnostic atheism) which believes in:
• the illusory meritocracy, despite their material conditions being due to the (upper) caste into which they were born.
• hating the religious dogma of all faiths (including Hinduism) while still holding dear their caste-based identity, hence the sympathy for Hinduism.
• a merit-based order that refuses to acknowledge caste discrimination and the caste wealth gap, seeing them instead as inherited inability.
• convenient switching between atheism and a minimal Hinduism.
• convenient switching between liberalism and conservatism.
Thus, it is a type of agnostic atheism that does more and more damage to gnostic atheist beliefs and their critical-thinking potential, making its holders more and more susceptible to theism; in the case of ex-Hindus, to Hinduism.
submitted by Crimson_SS9321 to atheismindia [link] [comments]


2024.06.01 14:25 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor Thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue: my model only predicts 2 of the 475 possible classes. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes.
That's my training script:
import os
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications import EfficientNetB7
from tensorflow.keras import layers, models
from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
import tensorflow_addons as tfa
import logging
import json

# Setup logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# Check if GPUs are available
gpus = tf.config.experimental.list_physical_devices('GPU')
if gpus:
    try:
        for gpu in gpus:
            tf.config.experimental.set_memory_growth(gpu, True)
        tf.config.set_visible_devices(gpus, 'GPU')
        logging.info(f"Using {len(gpus)} GPUs.")
    except RuntimeError as e:
        logging.error(e)
else:
    logging.error("No GPUs found. Check your device configuration.")

# Data directory
data_dir = "/app/FOOD475/"

# Image dimensions and batch size
img_height, img_width = 600, 600
batch_size = 64

# Data preprocessing and augmentation
train_datagen = ImageDataGenerator(
    rescale=1./255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest',
    validation_split=0.25
)

# Load and preprocess images
train_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='training'
)
validation_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='validation'
)

# Model creation function
def create_model(input_shape, num_classes):
    base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
    base_model.trainable = True
    inputs = layers.Input(shape=input_shape)
    x = base_model(inputs, training=True)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(num_classes, activation='softmax')(x)
    model = models.Model(inputs, outputs)
    return model

def find_latest_saved_model(checkpoint_dir):
    logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
    if not os.path.exists(checkpoint_dir):
        logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
        return None, 0
    subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
               if os.path.isdir(os.path.join(checkpoint_dir, d))]
    if not subdirs:
        logging.info("No subdirectories found for checkpoints.")
        return None, 0
    latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
    latest_epoch = int(os.path.basename(latest_subdir))
    logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
    if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
        return latest_subdir, latest_epoch
    else:
        logging.info("No saved_model.pb found in the latest directory.")
        return None, 0

# Mirrored strategy for multi-GPU training
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    saved_model_dir = 'model_training'
    checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
    latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
    if latest_saved_model:
        logging.info(f"Loading model from {latest_saved_model}")
        model = tf.keras.models.load_model(latest_saved_model)
    else:
        logging.info("No saved model found. Creating a new model.")
        model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

    if not os.path.exists(saved_model_dir):
        os.makedirs(saved_model_dir)
    summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
    with open(summary_path, 'w') as f:
        model.summary(print_fn=lambda x: f.write(x + '\n'))
    logging.info(f"Model summary saved to {summary_path}")

    optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
    model.compile(
        optimizer=optimizer,
        loss='categorical_crossentropy',
        metrics=['accuracy',
                 tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                 tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')]
    )

    # Custom Callback for Saving the Best Model in SavedModel format
    class SaveBestModelTF(tf.keras.callbacks.Callback):
        def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
            super(SaveBestModelTF, self).__init__()
            self.monitor = monitor
            self.saved_model_dir = saved_model_dir

        def on_epoch_end(self, epoch, logs=None):
            current = logs.get(self.monitor)
            if current is None:
                logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
                return
            logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
            epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
            if not os.path.exists(epoch_path):
                os.makedirs(epoch_path)
            self.model.save(epoch_path, save_format='tf')

    # Callbacks for monitoring progress
    tensorboard_cb = TensorBoard(log_dir='./logs')

    # Save class indices to a JSON file
    class_indices_path = 'model_training/class_indices.json'
    if not os.path.exists(os.path.dirname(class_indices_path)):
        os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
    with open(class_indices_path, 'w') as file:
        json.dump(train_generator.class_indices, file)
    logging.info(f"Class indices saved to {class_indices_path}")

    # Model training
    total_epochs = 7
    model.fit(
        train_generator,
        initial_epoch=latest_epoch,  # Start from the next epoch
        epochs=total_epochs,
        validation_data=validation_generator,
        callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
    )

    # Evaluate the model
    eval_result = model.evaluate(validation_generator)
    logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

    # Save the final model in SavedModel format (including .pb files)
    model.save('model_training/finished_model')
    logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

# Convert to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
tflite_model = converter.convert()
tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
if not os.path.exists(os.path.dirname(tflite_path)):
    os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
with open(tflite_path, 'wb') as f:
    f.write(tflite_model)
logging.info(f"Model converted and saved as {tflite_path}")
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
And when I try to load the model and make a prediction with this code:
class own:
    def __init__(self):
        if not os.path.exists("models/own"):
            raise FileNotFoundError(f"Model path models/own does not exist")
        try:
            self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
        except Exception as e:
            print(f"Error loading model: {e}")
            raise
        if not os.path.exists("models/own/class_indices.json"):
            raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
        with open("models/own/class_indices.json", 'r') as file:
            self.class_indices = json.load(file)
        self.index_to_class = {v: k for k, v in self.class_indices.items()}

    def classify(self, img_path):
        if not os.path.exists(img_path):
            raise FileNotFoundError(f"Image path {img_path} does not exist")
        # Load and preprocess the image
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
        img_array = tf.keras.preprocessing.image.img_to_array(img)
        img_array = np.expand_dims(img_array, axis=0)
        img_array /= 255.0
        # Make prediction
        predictions = self.model.predict(img_array)
        print("Raw predictions:", predictions)
        top_index = np.argmax(predictions[0])
        top_class = self.index_to_class[top_index]
        print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
        top_n = 5
        top_indices = np.argsort(predictions[0])[-top_n:][::-1]
        for idx in top_indices:
            print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
        return top_class
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead. 1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 
9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 
5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 
3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 
1.54079917e-05 9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s

https://github.com/tensorflow/addons/issues/2807
https://github.com/tensorflow/addons
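For reference, here is a minimal standalone sketch of the rescaling step I apply at inference (the helper name and the dummy 128-valued image are just for illustration), to show that it mirrors the training-time `rescale=1./255`:

```python
import numpy as np

# Illustration only: inference must rescale pixels exactly like the
# training ImageDataGenerator did (rescale=1./255).
def preprocess_like_training(img_array):
    # Same scaling the generator applied during training
    return img_array.astype("float32") / 255.0

# Dummy 600x600 RGB batch with uint8 pixels in [0, 255]
img = np.full((1, 600, 600, 3), 128, dtype=np.uint8)
scaled = preprocess_like_training(img)
print(scaled.shape, float(scaled.min()), float(scaled.max()))  # values now in [0, 1]
```

So as far as I can tell the training and prediction pipelines see images scaled the same way.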
Help would be appreciated because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to computervision [link] [comments]


7 model.fit( train_generator, initial_epoch=latest_epoch, # Start from the next epoch epochs=total_epochs, validation_data=validation_generator, callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb] ) # Evaluate the model eval_result = model.evaluate(validation_generator) logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}') # Save the final model as a SavedModel format (including .pb files) model.save('model_training/finished_model') logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'") # Convert to TensorFlow Lite converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model') tflite_model = converter.convert() tflite_path = 'model_training/lite_model/trained_model_lite.tflite' if not os.path.exists(os.path.dirname(tflite_path)): os.makedirs(os.path.dirname(tflite_path), exist_ok=True) logging.info(f"Directory {os.path.dirname(tflite_path)} created.") with open(tflite_path, 'wb') as f: f.write(tflite_model) logging.info(f"Model converted and saved as {tflite_path}") 
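The resume logic in the script hinges on picking the highest-numbered checkpoint subdirectory numerically, not lexicographically (so epoch 10 beats epoch 2). This can be exercised without TensorFlow; the sketch below is mine, not from the script: the function name `find_latest_checkpoint`, the simplified error handling, and the `tempfile` fixture are all assumptions standing in for the real `find_latest_saved_model`.

```python
import os
import tempfile

def find_latest_checkpoint(checkpoint_dir, marker="saved_model.pb"):
    # Mimics the script's resume logic: pick the highest-numbered epoch
    # subdirectory that actually contains a saved model file.
    if not os.path.isdir(checkpoint_dir):
        return None, 0
    numbered = [d for d in os.listdir(checkpoint_dir)
                if d.isdigit() and os.path.isdir(os.path.join(checkpoint_dir, d))]
    if not numbered:
        return None, 0
    latest = max(numbered, key=int)  # numeric compare: "10" > "2"
    path = os.path.join(checkpoint_dir, latest)
    if os.path.exists(os.path.join(path, marker)):
        return path, int(latest)
    return None, 0

with tempfile.TemporaryDirectory() as root:
    for e in (1, 2, 10):
        d = os.path.join(root, str(e))
        os.makedirs(d)
        open(os.path.join(d, "saved_model.pb"), "w").close()
    path, epoch = find_latest_checkpoint(root)
    print(epoch)  # 10 — not 2, because keys compare as ints
```

Note that a lexicographic `max(subdirs)` would resume from epoch 2 here; the `key=int` (or `key=lambda x: int(os.path.basename(x))` in the original) is what makes resuming past epoch 9 work.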
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
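As a quick sanity check on these logs (my arithmetic, not part of the original script): the step counts per epoch follow directly from the dataset sizes and batch size shown above.

```python
import math

# Figures taken from the training log above
train_images, val_images, batch_size = 182235, 60544, 64

train_steps = math.ceil(train_images / batch_size)
val_steps = math.ceil(val_images / batch_size)
print(train_steps, val_steps)  # 2848 946 — matching the progress bars in the log
```

So the generator is seeing the full 0.75/0.25 split each epoch, and the final 946-step run is the `model.evaluate(validation_generator)` pass.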
And when I try to load the model and make a prediction with this code:
import os
import json

import numpy as np
import tensorflow as tf
from tensorflow_addons.metrics import F1Score

class own:
    def __init__(self):
        if not os.path.exists("models/own"):
            raise FileNotFoundError("Model path models/own does not exist")
        try:
            self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
        except Exception as e:
            print(f"Error loading model: {e}")
            raise
        if not os.path.exists("models/own/class_indices.json"):
            raise FileNotFoundError("Class indices path models/own/class_indices.json does not exist")
        with open("models/own/class_indices.json", 'r') as file:
            self.class_indices = json.load(file)
        self.index_to_class = {v: k for k, v in self.class_indices.items()}

    def classify(self, img_path):
        if not os.path.exists(img_path):
            raise FileNotFoundError(f"Image path {img_path} does not exist")
        # Load and preprocess the image
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
        img_array = tf.keras.preprocessing.image.img_to_array(img)
        img_array = np.expand_dims(img_array, axis=0)
        img_array /= 255.0
        # Make prediction
        predictions = self.model.predict(img_array)
        print("Raw predictions:", predictions)
        top_index = np.argmax(predictions[0])
        top_class = self.index_to_class[top_index]
        print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
        top_n = 5
        top_indices = np.argsort(predictions[0])[-top_n:][::-1]
        for idx in top_indices:
            print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
        return top_class
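The dictionary-inversion and top-n parts of `classify` can be checked in isolation, without the model. Below is a minimal sketch under an assumption: the three-entry `class_indices` dict is a hypothetical stand-in for the real 475-class JSON file, to show that the JSON round-trip keeps integer values intact and that `np.argsort(...)[-top_n:][::-1]` yields indices in descending-probability order.

```python
import json
import numpy as np

# Hypothetical miniature class_indices, standing in for the 475-class JSON file
class_indices = {"omelette": 0, "steak": 1, "tacos": 2}
round_tripped = json.loads(json.dumps(class_indices))  # values come back as ints
index_to_class = {v: k for k, v in round_tripped.items()}

# Fake prediction vector, same shape convention as model.predict output
predictions = np.array([[0.2, 0.7, 0.1]])
top_index = int(np.argmax(predictions[0]))
print(index_to_class[top_index])  # steak

# Top-n via argsort, as in classify()
top_n = 2
top_indices = np.argsort(predictions[0])[-top_n:][::-1]
print([index_to_class[int(i)] for i in top_indices])  # ['steak', 'omelette']
```

So the index-to-name mapping itself behaves as intended; if the mapping were broken the top-5 printout above would show arbitrary class names rather than a stable pair.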
it always predicts either Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see:
  warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme:
  warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead. 1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 
9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 
5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 
3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 
1.54079917e-05 9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
https://github.com/tensorflow/addons/issues/2807
https://github.com/tensorflow/addons
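For scale (my back-of-envelope, using the top-5 probabilities printed above, rounded to five decimals): the raw prediction vector is close to uniform, which is what makes a whichever-class-is-fractionally-ahead result like "always Steak or Omelette" possible despite the ~0.80 validation accuracy during training.

```python
# Probabilities copied (rounded) from the printed top-5 above
top5 = [0.03991, 0.03166, 0.02757, 0.02174, 0.01915]
uniform_p = 1.0 / 475  # a perfectly uninformative 475-class softmax

print(round(uniform_p, 5))            # 0.00211
print(round(top5[0] / uniform_p, 1))  # top-1 is only ~19x the uniform baseline
print(round(sum(top5), 3))            # top-5 together hold ~0.14 of the mass
```

A confidently trained 475-class model would typically put far more than 14% of the probability mass on its top 5 guesses, so the distribution itself is evidence that something differs between training-time and inference-time inputs.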
Help would be appreciated because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to learnmachinelearning [link] [comments]


2024.06.01 14:23 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor Thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue where my model only predicts 2 classes out of the 475 possible. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes. My training script and its full output are the ones shown above.
1.54079917e-05 9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]] Top class: omelette, Probability: 0.03991156816482544 Class: omelette, Probability: 0.03991156816482544 Class: steak, Probability: 0.03165709227323532 Class: tacos, Probability: 0.027573999017477036 Class: breakfast_burrito, Probability: 0.021740607917308807 Class: pulled_pork_sandwich, Probability: 0.01914990320801735 (own): omelette - 3.66shttps://github.com/tensorflow/addons/issues/2807https://github.com/tensorflow/addons 
Help would be appreciated because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to deeplearning [link] [comments]


2024.06.01 14:21 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor's thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue where my model only predicts 2 classes out of the 475 possible classes. The model was trained on an HPC cluster with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes.
That's my training script:
import os
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications import EfficientNetB7
from tensorflow.keras import layers, models
from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
import tensorflow_addons as tfa
import logging
import json

# Setup logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# Check if GPUs are available
gpus = tf.config.experimental.list_physical_devices('GPU')
if gpus:
    try:
        for gpu in gpus:
            tf.config.experimental.set_memory_growth(gpu, True)
        tf.config.set_visible_devices(gpus, 'GPU')
        logging.info(f"Using {len(gpus)} GPUs.")
    except RuntimeError as e:
        logging.error(e)
else:
    logging.error("No GPUs found. Check your device configuration.")

# Data directory
data_dir = "/app/FOOD475/"

# Image dimensions and batch size
img_height, img_width = 600, 600
batch_size = 64

# Data preprocessing and augmentation
train_datagen = ImageDataGenerator(
    rescale=1./255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest',
    validation_split=0.25
)

# Load and preprocess images
train_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='training'
)

validation_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='validation'
)

# Model creation function
def create_model(input_shape, num_classes):
    base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
    base_model.trainable = True
    inputs = layers.Input(shape=input_shape)
    x = base_model(inputs, training=True)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(num_classes, activation='softmax')(x)
    model = models.Model(inputs, outputs)
    return model

def find_latest_saved_model(checkpoint_dir):
    logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
    if not os.path.exists(checkpoint_dir):
        logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
        return None, 0
    subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
               if os.path.isdir(os.path.join(checkpoint_dir, d))]
    if not subdirs:
        logging.info("No subdirectories found for checkpoints.")
        return None, 0
    latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
    latest_epoch = int(os.path.basename(latest_subdir))
    logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
    if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
        return latest_subdir, latest_epoch
    else:
        logging.info("No saved_model.pb found in the latest directory.")
        return None, 0

# Mirrored strategy for multi-GPU training
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    saved_model_dir = 'model_training'
    checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
    latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
    if latest_saved_model:
        logging.info(f"Loading model from {latest_saved_model}")
        model = tf.keras.models.load_model(latest_saved_model)
    else:
        logging.info("No saved model found. Creating a new model.")
        model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

    if not os.path.exists(saved_model_dir):
        os.makedirs(saved_model_dir)

    summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
    with open(summary_path, 'w') as f:
        model.summary(print_fn=lambda x: f.write(x + '\n'))
    logging.info(f"Model summary saved to {summary_path}")

    optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
    model.compile(optimizer=optimizer,
                  loss='categorical_crossentropy',
                  metrics=['accuracy',
                           tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                           tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')])

# Custom Callback for Saving the Best Model in SavedModel format
class SaveBestModelTF(tf.keras.callbacks.Callback):
    def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
        super(SaveBestModelTF, self).__init__()
        self.monitor = monitor
        self.saved_model_dir = saved_model_dir

    def on_epoch_end(self, epoch, logs=None):
        current = logs.get(self.monitor)
        if current is None:
            logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
            return
        logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
        epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
        if not os.path.exists(epoch_path):
            os.makedirs(epoch_path)
        self.model.save(epoch_path, save_format='tf')

# Callbacks for monitoring progress
tensorboard_cb = TensorBoard(log_dir='./logs')

# Save class indices to a JSON file
class_indices_path = 'model_training/class_indices.json'
if not os.path.exists(os.path.dirname(class_indices_path)):
    os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
with open(class_indices_path, 'w') as file:
    json.dump(train_generator.class_indices, file)
logging.info(f"Class indices saved to {class_indices_path}")

# Model training
total_epochs = 7
model.fit(
    train_generator,
    initial_epoch=latest_epoch,  # Start from the next epoch
    epochs=total_epochs,
    validation_data=validation_generator,
    callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
)

# Evaluate the model
eval_result = model.evaluate(validation_generator)
logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

# Save the final model as a SavedModel format (including .pb files)
model.save('model_training/finished_model')
logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

# Convert to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
tflite_model = converter.convert()
tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
if not os.path.exists(os.path.dirname(tflite_path)):
    os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
with open(tflite_path, 'wb') as f:
    f.write(tflite_model)
logging.info(f"Model converted and saved as {tflite_path}")
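As a side note, the checkpoint-resolution logic in the script can be exercised on its own. Below is a self-contained sketch (the temp-directory layout is a made-up test fixture, and the helper mirrors the one in the script above) showing that it picks the numerically largest epoch directory containing a saved_model.pb:

```python
import os
import tempfile

def find_latest_saved_model(checkpoint_dir):
    """Return (path, epoch) of the newest checkpoint subdir, mirroring the script above."""
    if not os.path.exists(checkpoint_dir):
        return None, 0
    subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
               if os.path.isdir(os.path.join(checkpoint_dir, d))]
    if not subdirs:
        return None, 0
    # Numeric comparison, so epoch 10 beats epoch 2 (lexically '10' < '2')
    latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
    latest_epoch = int(os.path.basename(latest_subdir))
    if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
        return latest_subdir, latest_epoch
    return None, 0

with tempfile.TemporaryDirectory() as ckpt:
    for e in (1, 2, 10):
        d = os.path.join(ckpt, str(e))
        os.makedirs(d)
        open(os.path.join(d, 'saved_model.pb'), 'w').close()
    path, epoch = find_latest_saved_model(ckpt)
    print(epoch)  # 10
```

One caveat worth noting: any non-numeric subdirectory under checkpoints/ would make `int(os.path.basename(x))` raise ValueError, so the directory must contain only epoch-numbered folders.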
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
And when I try to load the model and make a prediction with this code:
# Imports added for completeness; F1Score must be in scope for custom_objects
# (presumably tfa.metrics.F1Score, matching the training script's compile step)
import os
import json
import numpy as np
import tensorflow as tf

class own:
    def __init__(self):
        if not os.path.exists("models/own"):
            raise FileNotFoundError(f"Model path models/own does not exist")
        try:
            self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
        except Exception as e:
            print(f"Error loading model: {e}")
            raise
        if not os.path.exists("models/own/class_indices.json"):
            raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
        with open("models/own/class_indices.json", 'r') as file:
            self.class_indices = json.load(file)
        self.index_to_class = {v: k for k, v in self.class_indices.items()}

    def classify(self, img_path):
        if not os.path.exists(img_path):
            raise FileNotFoundError(f"Image path {img_path} does not exist")
        # Load and preprocess the image
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
        img_array = tf.keras.preprocessing.image.img_to_array(img)
        img_array = np.expand_dims(img_array, axis=0)
        img_array /= 255.0
        # Make prediction
        predictions = self.model.predict(img_array)
        print("Raw predictions:", predictions)
        top_index = np.argmax(predictions[0])
        top_class = self.index_to_class[top_index]
        print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
        top_n = 5
        top_indices = np.argsort(predictions[0])[-top_n:][::-1]
        for idx in top_indices:
            print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
        return top_class
it always predicts either Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: https://github.com/tensorflow/addons/issues/2807
  warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: https://github.com/tensorflow/addons
  warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
1/1 [==============================] - 4s 4s/step
Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 ... (475 softmax values in total, none above 0.04) ... 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
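One detail worth noting in the raw output above: the top probability is only about 0.04 over 475 classes, so the softmax is close to uniform and "steak"/"omelette" merely sit at the top of an almost-flat distribution. A way to quantify that flatness (a standalone numpy sketch, not part of the original post's code) is normalized entropy:

```python
import numpy as np

def normalized_entropy(p):
    """Entropy of a probability vector, scaled to [0, 1]:
    1.0 = perfectly uniform softmax, 0.0 = all mass on one class."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    q = p[p > 0]                      # drop zeros so log() is well-defined
    h = -np.sum(q * np.log(q))
    return h / np.log(len(p))

# A perfectly flat softmax over 475 classes, like the raw output above
# (even the top class only reaches ~0.04 there):
uniform = np.full(475, 1.0 / 475)
# A fully confident prediction, for comparison:
one_hot = np.zeros(475)
one_hot[0] = 1.0

print(normalized_entropy(uniform) > 0.99)  # True
print(normalized_entropy(one_hot) < 0.01)  # True
```

A normalized entropy near 1.0 on real predictions would suggest the network's outputs carry almost no class information, whatever the argmax happens to be.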
Help would be appreciated because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to tensorflow [link] [comments]


2024.06.01 14:07 RamiRustom 💘 Helping People Before and After Leaving Islam 💘 June 2024 (46 new posts & videos)

📢 DON'T MISS THE LAUNCHING OF 'UNITING THE CULTS' ON JUNE 14TH 12PM CDT

Mark your calendars for the 'Uniting The Cults' livestream event on the 50th anniversary of Feynman's 'Cargo Cult Science' speech. Sign up for email updates/reminders here, and here's the livestream link!
'Uniting The Cults' is a non-profit whose purpose is to be an agent of cultural change, with a vision of a world without apostasy laws: a world governed by scientific thinking, where people recognize love as the goal and rationality as the method to achieve it. Our brothers and sisters need our help. They're living in fear, unable to speak for themselves, so we must speak for them. Here's how you can support your brothers and sisters suffering in fear... 💘
➡️ Dear doubting Muslims
➡️ Dear doubting Muslims and new Ex-Muslims
➡️ Ready to learn more philosophy? 💪
➡️ Uniting The Cults podcast, new episodes since last newsletter
Some posts from the last edition were removed because they weren't as good and to keep this list short and sweet. If you want to see them anyway, check out the last edition.
submitted by RamiRustom to exmuslim [link] [comments]


2024.06.01 13:56 genericusername1904 H.G. WELLS’S, THE SHAPE OF THINGS TO COME (1933) VS. 1984 AND BRAVE NEW WORLD


ID, IX. MAIORES. V, CAL. IUNI. FORTUNA PRIMIGENIA.

I discovered this book by complete chance last year – a very old hardback copy was given to me as a gift (in a situation which was certainly weighted with the most unlikely of synchronicities). “Huh,” I thought, “it’s a first edition of H.G. Wells.” The book itself almost cannot be opened because it is so old and falling apart, so I procured a text and audio file of the thing relatively easily and began to read. In hindsight, not only for myself but I fancy for the generations of the last fifty years, it is deeply strange that this book has not been more widely recognized or taught in schools alongside 1984 and Brave New World, as the third contender (although technically the second, published one year after Huxley – seemingly written at the same time, interestingly enough) in “visions of dystopia” – except that the book is not so much a vision of dystopia tomorrow but a vision of dystopia ‘today’, or rather ‘life as we know it’ of the 19th, 20th and 21st Centuries (endless war, endless pandemics, economic and logistic chaos), narrated from the comfortable and reassuring position of a society far, far in the future who have long since revised their culture, solved all of the causes of the problems, and become a society of genius polymaths “with (every Man and Woman) the intellectual equal of the polymaths of the ancient world.”
Now, I do not mean here to ‘sweet-talk’ the reader into rushing out and buying this book, or to hold it up in the manner of those other books as if it were some ideological blueprint, but instead to assay the thing in the natural context which seems to me to be universally unrealized and which presents itself as plainly self-evident: that against the depressing and miserable dichotomy of 1984 and Brave New World – two extremely atomizing and miserable narratives – there is also, far more empowering, The Shape Of Things To Come, wherein the miserable protagony and antagony of both 1984 and Brave New World might read as merely a footnote somewhere in the middle of the book, an example of the witless measures mankind’s old masters undertook to preserve their power in an untenable circumstance. In other words, we know all about 1984 as children; we have this drummed into our heads, and from it we glean the cultural comprehension that dictators cannot be cliques of business people but only lone individuals, usually in military uniform. Then we graduate from that to Brave New World to gain a more sophisticated comprehension of the feckless consumerism and ‘passive egoism’ by which our society actually operates. But then we do not – as I argue we ought – continue along in our education with this third book, which actually addresses the matters at hand at a more adult level.
For instance, a passage from ‘The Breakdown Of Finance And Social Morale After Versailles’ (Book One, Chapter Twelve) addresses in a single paragraph the cause of our continual economic chaos (from which all crime and poverty and war originate) and highlights why this chaos goes unresolved despite being easily resolvable: “adjustment was left to blind and ill-estimated forces,” “manifestly, a dramatic revision of the liberties of enterprise was necessary, but the enterprising people who controlled politics (would be) the very last people to undertake such a revision,”

…the expansion of productive energy was being accompanied by a positive contraction of the distributive arrangements which determined consumption. The more efficient the output, the fewer were the wages-earners. The more stuff there was, the fewer consumers there were. The fewer the consumers, the smaller the trading profits, and the less the gross spending power of the shareholders and individual entrepreneurs. So buying dwindled at both ends of the process and the common investor suffered with the wages-earner. This was the "Paradox of Overproduction" which so troubled the writers and journalists of the third decade of the twentieth century.

It is easy for the young student to-day to ask "Why did they not adjust?" But let him ask himself who there was to adjust. Our modern superstructure of applied economic science, the David Lubin Bureau and the General Directors' Board, with its vast recording organization, its hundreds of thousands of stations and observers, directing, adjusting, apportioning and distributing, had not even begun to exist. Adjustment was left to blind and ill-estimated forces. It was the general interest of mankind to be prosperous, but it was nobody's particular interest to keep affairs in a frame of prosperity. Manifestly a dramatic revision of the liberties of enterprise was necessary, but the enterprising people who controlled politics, so far as political life was controlled, were the very last people to undertake such a revision.

There is a clever metaphor, I fancy, that Wells worked into this for the ‘actual’ de facto controlling class of things: not really the politicians (sorry to disappoint the Orwell and conspiracy fans) but instead the ‘Dictatorship of the Air’, which might easily be read as the ‘Dictatorship of the Airwaves’ – in colloquial language, radio and then television. Certainly we might imagine Rupert Murdoch or Ted Turner or Sumner Redstone (of yesterday) entering into honourable retirement like the ‘dictators of the air’ of the very last days before the establishment of a one world state. In any case that is how things would work out, since the power of, say, Ted Turner to eradicate a political party in the United States – at any time he wished – simply by green-lighting relentless coverage of its bad actions for months until revolution occurred is a real power which no other institution possesses, nor possesses any means of defence against; i.e. the ‘real power’ in our world to end a war or begin a war, to end this or begin that, is the power held by the organized press. This metaphor is, I think, a more mature view than Wells’s earlier conception of the press in The Sleeper Awakes (1899), where the press of a dystopian future is visualized as a “babble machine” spreading circular nonsense to preoccupy the citizenry (although this is arguably a true representation of the mental processes of the Twitter and Facebook user, or of the general baby-speak and extremely infantile form of the news reports on the front page of the BBC News website) – which is more or less what the press is depicted as being in Brave New World also.
However, the construction of sudden new realities (or sudden ‘actualities’) produced by the stacking of interdependent technological innovations (i.e. the radio and the television in this instance) is mentioned early on in The Shape Of Things To Come, in ‘How The Idea And Hope Of The Modern World State First Appeared’ (Book One, Chapter Two),

The fruitlessness of all these premature inventions is very easily explained. First in the case of the Transatlantic passage; either the earlier navigators who got to America never got back, or, if they did get back, they were unable to find the necessary support and means to go again before they died, or they had had enough of hardship, or they perished in a second attempt. Their stories were distorted into fantastic legends and substantially disbelieved. It was, indeed, a quite futile adventure to get to America until the keeled sailing ship, the science of navigation, and the mariner's compass had been added to human resources. (Then), in the matter of printing, it was only when the Chinese had developed the systematic manufacture of abundant cheap paper sheets in standard sizes that the printed book—and its consequent release of knowledge—became practically possible. Finally the delay in the attainment of flying was inevitable because before men could progress beyond precarious gliding it was necessary for metallurgy to reach a point at which the internal combustion engine could be made. Until then they could build nothing strong enough and light enough to battle with the eddies of the air.

In an exactly parallel manner, the conception of one single human community organized for collective service to the common weal had to wait until the rapid evolution of the means of communication could arrest and promise to defeat the disintegrative influence of geographical separation. That rapid evolution came at last in the nineteenth century, and it has been described already in a preceding chapter of this world history. Steam power, oil power, electric power, the railway, the steamship, the aeroplane, transmission by wire and aerial transmission followed each other very rapidly. They knit together the human species as it had never been knit before. Insensibly, in less than a century, the utterly impracticable became not merely a possible adjustment but an urgently necessary adjustment if civilization was to continue.

In other words, then, a global state (or, rather, such power in general held by the press, as I see the analogy extending to them as the ‘Dictatorship of the Airwaves’) was impossible to imagine and completely laughable before the technologies had stacked together – like a simple piece of arithmetic producing a single outcome of the equation – such that no sooner did the technologies exist than the thing had become an actual reality: 1) unassailable political power had been unthinkingly dropped into the lap of the owners of the press, and, more importantly as a consequence, 2) mankind was subject to that power. That is: the situation existed the moment the technologies did – and this whether or not any living person had even realized it, as I think quite naturally Men and Women all the time invent things that they really have no notion of the fullest or most optimal uses of (“nothing is needed by fools, for: they do not understand how to use anything but are in want of everything,” Chrysippus); e.g. in no metaphor, the television was quite literally invented as a ‘ghost box’ to commune with ghosts imagined to reveal themselves by manipulating the black and white of the static, until someone else had the idea that there was at least one other use for that contraption.
It is quite strange, also, that in contemporary times we have for ages been heavily propagandized ‘against’ the idea of a “one world state”, as if all the crimes and fecklessness that have gone on in our lifetimes were somehow secretly building towards the creation of such a thing – not a conclusion you would naturally draw from an observation of those events, nor a thing advocated for by anybody (insofar as I have ever heard), yet it is the thing which would be the first logical response to ‘preventing’ such crimes from ever occurring again – such as the already widely practiced concept of a Senate-Style Federation of Sovereign States rather than a hundred or so mutually antagonistic polities capable of bombing one another, screwing up their economies, and creating waves of refugees, mass starvation, pandemics, and so on. For instance, all Egypt is dependent on the flow of the Nile, which originates in what is today another country, and that other country recently decimated the flow of the Nile by gumming it up with a Hydroelectric Dam. Such an outcome would not occur if the total mass of the land itself were governed as the single interconnected economic and environmental system that it is in physical reality; divided along arbitrary borderlines, there is no means to govern the entirety of the region in an amicable and prosperous manner for all as a whole, and no recourse to the otherwise intolerable situation but War – which is unlikely to occur, as most Nations are comprised of civilized peoples who rightly loathe the concept of War, yet it is the single and unavoidable outcome of such a situation once it has dragged on for decades, causing immense suffering, until it reaches that point of desperation. The matter of Palestine and Israel, fresh to my mind in these days, raises itself also.
Of the matter of War itself, in ‘The Direct Action Of The Armament Industries In Maintaining War Stresses’ (Book One, Chapter Eleven), Wells relays in 1933 what United States President Eisenhower would later remark in his 1961 farewell address on the dangers of the Military Industrial Complex – albeit far more analytically on Wells’s part: it is not so much a ‘desire to harm’ on the part of the armament industries which sees them engage in unnecessary build-up of weapons stockpiles, but simply that it is their business to produce and stockpile, to produce more deadly variants and stockpile those, and to sell off their old stockpiles to whomsoever rings their doorbell. For instance, the on-going War in Ukraine is no different in this regard from the war fought by the Viet Cong and the United States in Vietnam, in that massive quantities of cheap munitions were necessary for the war to be fought in the first place, and massive quantities of munitions happened to exist as a by-product of the Armaments Industries, to be dumped onto the warring parties in order to facilitate their macabre impulses at the expense of the citizenry – at their cost both in terms of the debt taken on by their governments to procure the weaponry, and in terms of their lives when the weaponry was utilized, to the outcome of massive loss of life of a single people within a bordered space – a thing of no value to themselves. Simply put, in a very simplistic reduction to the bare basics: the War would not have reached such catastrophic inhuman proportions without massive quantities of cheap Armaments that otherwise sat taking up warehouse space needed for more valuable Armaments on the part of the producer and seller.

In a perpetual progress in the size and range of great guns, in a vast expansion of battleships that were continually scrapped in favour of larger or more elaborate models, (Armament Firms) found a most important and inexhaustible field of profit. The governments of the world were taken unawares, and in a little while the industry, by sound and accepted methods of salesmanship, was able to impose its novelties upon these ancient institutions with their tradition of implacable mutual antagonism. It was realized very soon that any decay of patriotism and loyalty would be inimical to this great system of profits, and the selling branch of the industry either bought directly or contrived to control most of the great newspapers of the time, and exercised a watchful vigilance on the teaching of belligerence in schools. Following the established rules and usages for a marketing industrialism, and with little thought of any consequences but profits, the directors of these huge concerns built up the new warfare that found its first exposition in the Great War of 1914-18, and gave its last desperate and frightful convulsions in the Polish wars of 1940 and the subsequent decades.

Even at its outset in 1914-18 this new warfare was extraordinarily uncongenial to humanity. It did not even satisfy man's normal combative instincts. What an angry man wants to do is to beat and bash another living being, not to be shot at from ten miles distance or poisoned in a hole. Instead of drinking delight of battle with their peers, men tasted all the indiscriminating terror of an earthquake. The war literature stored at Atacama, to which we have already referred, is full of futile protest against the horror, the unsportsmanlike quality, the casual filthiness and indecency, the mechanical disregard of human dignity of the new tactics. But such protest itself was necessarily futile, because it did not go on to a clear indictment of the forces that were making, sustaining and distorting war. The child howled and wept and they did not even attempt to see what it was had tormented it.

To us nowadays it seems insane that profit-making individuals and companies should have been allowed to manufacture weapons and sell the apparatus of murder to all comers. But to the man of the late nineteenth and early twentieth centuries it seemed the most natural thing in the world. It had grown up in an entirely logical and necessary way, without any restraint upon the normal marketing methods of peace-time commerce, from the continually more extensive application of new industrial products to warfare. Even after the World War catastrophe, after that complete demonstration of the futility of war, men still allowed themselves to be herded like sheep into the barracks, to be trained to consume, and be consumed, by new lines of slaughter goods produced and marketed by the still active armament traders. And the accumulation of a still greater and still more dangerous mass of war material continued.

The book is, as the reader has likely already gathered from the excerpts, not written in the style of a protagonal narrative – i.e. not as a story, with no hero and no villain – but as a sort of Historia Augusta. That is really the most fitting comparison I can think of when trying to describe it to a new reader (or perhaps J.J. Scarisbrick’s Henry VIII): it is written ‘as’ a History in the classical style we are familiar with from the better of the ancient writers, like Appian or Cassius Dio, but unlike Suetonius or Tacitus it is absent the sloppy hinging of all bad things on highly personalized propaganda ad hominem (i.e. blaming the fall of an empire on one guy) that goes on in those narrative works as we are typically familiar with them.
It is, of course, a work of fiction, although Wells did predict World War Two beginning in late 1939-1940 (though he had Poland putting up a much better and longer fight against the Germans) and various other innovations, beginning from his own day with a true account of events prior to 1933 – giving us a valuable account of affairs and actors which would otherwise not come easily to any of us to discover. But the book, ultimately, is a vehicle for the transmission and discussion of these societal (i.e. social, economic, industrial, logistic) matters, presented to the audience of the day fresh, in their own minds, from the abject horror recently witnessed in World War One – and from the economic catastrophes for which Roosevelt’s reforms had not yet come into tangible reality (i.e. relief for the poor, public works projects such as the motorways across America), as is discussed in that other seemingly little-known H.G. Wells literary offering, his face-to-face interview with Josef Stalin the following year in 1934 (something which I think is of far more historical value than, say, Nixon and Frost or Prince Andrew and Emily Maitlis) – so as to ‘avert’ another crisis and pluck from the ether a seemingly alternate trajectory on which Mankind might at last get its act together.
This ‘novel’ (though it seems strange to call it that) ought, I would advise, to be read in conjunction with ‘The Sleeper Awakes’ (1899) and also the (actually very depressing – I would not advise it) short-story prequel ‘A Story Of The Days To Come’ (1897), set in that same universe – which, perhaps because I am English, seems to me a black horror show of the reality we actually find ourselves living in this far into an actually dystopic future – or perhaps, with the ‘strange windmills’ powering the mega cities, a future yet to come (no pun intended): the broken speech, the babble machines, the miserable condition of the Working Class and their consumption of pre-packaged soft bread, the desire to flee the urban sprawl into the dilapidated countryside and make a little life in a run-down house with tacky wallpaper peeling away … ah, forgive me. My point is that ‘our condition’ – i.e. that of those of us literate in English – is quite analogous to the condition of the central characters in those two stories: a culture dulled intellectually to the point that it can barely speak or think, appraised and assayed by ourselves, those of us simply literate, so as to render our commentary as mutually alien as Caesar’s in Gaul. However, it is in the context of the frame given to us in ‘The Shape Of Things To Come’ that we might gain a degree of sanity about this self-same situation: to study and lean into that dispassionate quality, to discern the nature of things as they are, and to recognize how important this quality is to Wells’s ultimate outcome for the best possible position of Humankind in the far, far future – that of Humankind’s vital intellectual capacity. The most striking message of STC, beyond all we have mentioned in this little overview, is that intellectual capacity in and of itself.
For example, when we consider the ‘actuality’ of the power of Turner, or perhaps of Zuckerberg in his heyday, we consider a power fallen into a man’s lap by an accidental stacking of disparate technologies created not by himself – a power utterly dependent, in that same equation, upon a population being ‘witless’ in the first place and so led slavishly by the “babble machines”. However you cut it, reader, the great uplifting of Humankind to a standard of autonomy and intellectual prowess – not held by an elite but possessed by All People – is a thing both intrinsically self-sufficient and within our grasp for our own selves, and is certainly the prerequisite for political matters, in that the intellectual capacity of the voting public determines entirely whether a public is tricked or foolish and gets itself into trouble by undertaking some obvious error, or whether it is immune to such trickery and foolishness in the first place and its energies and time are spent on more valuable pursuits. It seems to me that our contemporary society has done away with the notion of good character through intellect, and that we live with the outcome of this: shepherded by emotional manipulation and brute force, because our society at large is treated as if it lacked the verbal and intellectual toolsets to understand anything else – moreover possessing no means to discern whether what is forced onto us is right or wrong, truth or lies, and so on.
Such a society as this, again it seems plain to me, is ‘any’ dystopia, because it is the baseline composition for ‘all’ dystopia. The foolish dogma of an out-dated ideology, for example, rests upon a large enough contingent of the public either being treated as if they were, or in fact being, “too foolish” to discuss or think a thing through; so a dogma is poured over them like concrete, creating in turn intolerable circumstances as the dogma, tomorrow, becomes out-dated and suddenly instructs them to do foolish things – as in the “Banality Of Evil” (read: Hannah Arendt), the character of all serious perpetrators of inhumanity who insist, with a confused expression on their faces, that they were just doing their job. This ‘quality’ of extreme ignorance is the composition of the culture in which such ‘evil actions’ occur.
I mean here that in STC we have on one hand a very in-depth account, very serious reading, to graduate the reader out of the depressive, atomizing, disempowering, conspiratorial milieu and mire of ‘life’ presented to us in 1984 and Brave New World, but at the same time the very resonant harmonics that one does not need to “wait around for a distant future utopia” to “solve all the problems”: the tools to do so are well within our grasp at any time we so choose, and such an undertaking constitutes the foundation stones and tapestries of that future utopia, which, I think, could be said to “meet us half-way” in many of these matters – as we reach forward and they reach back, and those in the past reach forward and we in the present reach back. That, anyway, is what it is to learn from the past, and anyway the answer to “why the Grandfather sows the seeds for trees from whose fruits he will never eat.”
Valete.

ID, IX. MAIORES. V, CAL. IUNI. FORTUNA PRIMIGENIA.

FULL TEXT ON GUTENBERG OF H.G. WELLS ‘THE SHAPE OF THINGS TO COME’ (1933)
submitted by genericusername1904 to 2ndStoicSchool [link] [comments]

