Old English name tattoo generator

Tattoo Designs

2011.11.26 03:58 lorenlogan Tattoo Designs

This sub is for sharing and discussing tattoo designs, whether it's your own tattoo, work you've done, or asking for opinions about a tattoo you want to get. All tattoos must be by a professional unless you're asking how to cover up a past mistake; scratcher/unprofessional tattoos aren't welcome here.
[link]


2008.01.25 04:36 Podcasts - discover, discuss, review

podcasts: a subreddit to discover, discuss, and review podcasts with other podcast enthusiasts. As part of this mission, podcasts is curated to promote respectful and on-topic discussions. This is not a place to promote your podcast.
[link]


2011.11.04 01:23 Kawaiijake Fullmetal Alchemist

Come post anything related to Hiromu Arakawa's Fullmetal Alchemist anime and manga franchise! Questions, discussion, analysis, fan art, cosplay, quality memes, etc. are all welcome.
[link]


2024.06.01 14:32 g3thic [F4A][Literate] Longterm Roleplay Partner!

Hello again! I'm not sure if you've seen my other posts about a fandom roleplay, but this one is gonna be about any roleplay in general, fandoms included. This'll be pretty detailed, and I'll label the parts in case you want to skip ahead (I suggest you don't). If you don't wanna read all of this, then don't. This was made for people willing to read blocks of paragraphs and maybe even respond with their own.
INTRODUCTION
My name is Hina. To tell you more about me, I hail from Japan and I have been an avid writer ever since I moved to the States when I was 11. English is my second language, but I think it's gotten pretty good. I recently turned 22 and I'm female. Speaking of age, I'd be comfortable with you being 17+, and preferably at least 20. Roleplaying with minors isn't something I do, I apologize. I'm currently in the GMT+1 timezone, but that's temporary; I'm here until July, and then I'll be on the West Coast, PST timezone. Let's see... what other information can I give you? I enjoy skating, basketball, watching TV shows and anime, and reading. My favorite anime are Nana and Death Note.
GENRES + FANDOM
I am not looking for a specific roleplay. I'd say I'm skilled in all genres: sci-fi, fantasy, horror, apocalypse, all of that. Even slice of life, though that depends on the exact plot. To be more specific about each genre, starting with fantasy: I'm more used to high fantasy and mythology. I'm not that great with medieval, unfortunately. DnD-based roleplays aren't really a fit for me, and I struggle playing species like ogres. Just putting that out there. For fantasy, I don't have any specific ideas.
Sci-fi is the genre I have the most skill in. Most of my roleplays are based on it! I'm fine with all of its subgenres.
For other genres like horror and apocalypse, I do have some tastes. I really like monsters and creepy things from the horror genre, like vampires and all of that, and I even have my own idea set in older times dealing with vampire lords and hunters. I also enjoy eldritch-type horror. There's also that one subgenre of it, like video game horror? I'm not sure how to describe it, and I forget if it even has an official name or if it's just something used to describe the horror genre. I like Resident Evil, so maybe that'll tell you the type of horror I usually enjoy. I do have a developed idea for something more eldritch horror.
More on fandoms! To get some other things down, I usually only play OCs unless the character you want me to play is one I know well and am comfortable playing. The anime fandoms I like are JoJo's, Nana, Death Note, JJK, AOT, Haikyuu, and probably more. I'm well versed in the JoJo's, AOT, and JJK fandoms but less knowledgeable on Death Note since I've only seen it once. Other fandoms I'm in include ATLA, TLOK, Harry Potter, Resident Evil, Marvel, DC, and many more.
CHARACTERS
I tend to use character sheets to describe my characters; these usually consist of a name, background, and personality. As for appearances, I prefer using animated or drawn references rather than real-life people. I enjoy references good enough that I get the idea of how the character would look.
The types of characters I write are either the lone-wolf type with some sad past that leads them to join someone for a redemption arc, or the bubbly character who brings the mood up and is usually seen as trustworthy and that kind of thing.
I like all types of tropes, especially enemies to lovers or rivalry. I also really enjoy opposites attract as a whole, whether it's opposite personalities or something else they'd be opposites in. Enemies to lovers takes my heart, though. I love seeing the characters push past the urge to hate each other and/or go beyond their usual way of disliking the other's lineage or upbringing.
REQUIREMENT
I think this is my last section of the ad. It's the most important, at least: requirements. All roleplay searches come with them, or at least that's what I've heard! But don't fret, there isn't much.
I've seen this as one of the most-used requirements, and I agree with it. As someone whose first language wasn't English, I understand that you may not be great at it. But please, I do require a partner who at least uses proper grammar and punctuation. You don't have to use big words or anything; just know where to put your periods and how to place your words.
My second requirement is for you to be LITERATE! Please. I'm a big writer; I tend to ramble and write more than I thought I would (like I'm doing right now). I write multiple paragraphs from the starter until the scene relaxes. I also understand that writing big blocks of text every response can get tiring or boring, so I don't expect it all the time, just after the starter has been made and in more important scenes. Dialogue also cuts my replies shorter.
Please, please, please be polite in OOC! We may just be roleplaying together, but kindness goes both ways. If we do include OOC, I enjoy talking about many things: my day, movies, games, funny moments and stories, all of that!
I think that's the end of this wonderful journey of an ad about my search. I hope you're still here, fellow writer! I would LOVE it if you reached out to me! This wasn't all for nothing, right?
But don't leave yet! I do have a passcode. I know this was a jumble of words and rambling, but I still have to put one in. I've heard there are a lot of people on here who don't read things fully and miss out on rules or information! But since you read through all of this, I'll gladly give you options for the passcode. Also, please include an introduction of yourself! Don't think, "Oh, maybe I shouldn't bother this person with too much to read"! I like seeing big blocks.
PASSCODE:
What’s your dream country to travel to and why?
OR
Who’s your favorite TV show / Cartoon / Game / Anime character?
Feel free to pick both! Now, that’s all from me. Please don’t put your request as just “Wanna rp”!
submitted by g3thic to roleplaying [link] [comments]


2024.06.01 14:31 Potential-Lack-5185 Fan-Wars in the SUB. And are people wrong to extra-support their POC faves?

Would super appreciate it if everyone reads this long-ass thesis of a post with an open mind and to the end... it's me being vulnerable in an anonymous Reddit sub because I think these things matter in the larger framework of a uniquely special show and the larger world we all live in as humans. My last post in this sub was downvoted to hell, which btw is totally okay... I don't depend on Reddit currency for my livelihood, so it's all good. I just care about my posts being read... especially this one.
Some context:
I'm very new to this community and the show as well. Currently have time off work and going ham on reddit and venturing very scared into the deep dark of world of rabid fandoms and small niche communities. Getting a little burnt along the way but also learning from some super talented fans with their own eye-opening ideas and funny posts. Im also an aspiring writer and well...some of my best ideas have come from reddit lurking on cool and active subs with bridgerton sub being new for me. Jet lagged, anxious and just a tad bit antsy, I'm trying to go down and deep and maybe open some deeper discussions here.
I am Indian and lived primarily in India my whole life and now for the last 10 years between London and Canada (why am I sharing all this like am I going to tell you my weight, my height etc too...No but hopefully you'll understand this revelation as you read ahead without me spelling it ou)
So I've been trying to figure out why fan wars about this show feel so personal.
I think fandoms everywhere, in every country, are full of insane people, obsessive chronically online people who find comfort in living through their faves: your Beyoncé stans, your Rihanna stans, your Taylor Swift stans, your Harry Styles stans, your K-pop fans, etc., etc.
This show also has some rabid fans. But there is a conversation I want to open up only because I think it's important: is fighting extra hard for your POC fave really wrong, when you know their representation in pop culture is limited, their chances of getting jobs are limited, and subconscious biases still exist, even though we have made great strides and even eradicated overt biases in employment, in accessibility, in government service, etc.?
Our ideas of beauty are framed by the extensiveness and longevity of colonial rule across the world.
There are shit people across all ethnicities, religions, genders and nationalities. Barbarians, conquerors, invaders, pillaging villages, raping women, destroying livelihoods, suppressing the peasant class, mutilating and diluting cultures: that existed in Africa, the Indian subcontinent, East Asia, the Middle East, etc., so not just Britain. Basically, being brown, Black, East Asian or Middle Eastern doesn't grant you moral superiority over white people, and you can be racist and casteist and colorist and bigoted and biased and just all-around shitty as a Black, brown, East Asian or Middle Eastern person just as much as a white person can.
So now that that's settled, let's look at the specific issues that only POC face, because that eventually streams into the show and its politics and its discussion as well.
But for whatever reason, the British were able to colonize on a scale that other countries and ethnicities were not. African and Indian-subcontinent conquests and invasions were limited in area and time period, their scale of conquest confined to a small radius around their own countries, not foreign lands travelled to via sea. As a result, and only as a result (and I cannot emphasize enough that it's not because black, brown or East Asian invaders and rulers were morally superior, kinder, or less awful), the culture that became dominant was white British culture, or European culture. If Indians had conquered the world, they would have imposed their culture on the lands and countries they invaded; if different African countries had invaded the world to the extent of the British colonists, same thing: the dominant culture forcibly imposed would have been theirs. Black skin would have been considered beautiful, fair white skin would not have been the standard. That's history. That's something we know to be true even through modern wars: the victor dictates. But that simply didn't happen.
It's not because Africans or Asians or basically non-white people were morally superior, or, I don't know, were like "we don't invade and conquer, we're so good." It's because that's history; they just didn't manage it. That's simply our current historical reality: the power that did manage to conquer the world, or close to 9/10ths of it, was European, or more specifically British. Which means, as would have happened if another nationality had managed to invade the world, the culture that is dominant across the world is a white/Western-slanted one.
Colorism as a concept didn't exist until colonization. Why would it? If everyone existed under the same harsh sun and had the same skin color with minor variances, why would they think fair is more beautiful? They wouldn't. The first blond-haired, blue-eyed people would have been introduced to India via the early traders of the East India Company, and the Dutch colonists before them. It's like if everyone around you is brown, you assume that's just how everyone looks; that's the only reality you know; you simply haven't seen anyone look any different, you haven't seen blue eyes or anything else. Same for Africans, and within that framework, as humans are wont to do, you create ideas of beauty.
So yes, POC across the globe had just as arbitrary ideas of beauty as white people (the Chinese mutilated young women's feet because small feet were considered more beautiful; in parts of Africa, genital mutilation), because humans are humans and humans mess everything up (and animals are so much better). But color as a construct is a colonial one, and one which managed to find deep, deep roots because of the extent of the rule and its sheer longevity. For context, India has been independent for the last 75 years, after over 200 years of colonial British rule. Not even the length of a single human life. Not even as long as Britain's former queen was alive. It will take some more time for that infighting and those ideas of colorism, deeply embedded as an inheritance from our rulers, to get dismantled and totally thrown out, because you can free yourself from literal bondage more easily; the mind takes longer to adjust and form independent thought. But it will happen.
White people are not more or less racist than POC. That's not a thing. There are shit people, and you can find shit people anywhere. I have shit people right in my own home country, bigots destroying the diversity of India with bullshit tactics.
But as it stands, because this cultural superiority of Britain and white people took over, the framing of beauty, of what's fashionable, of what's cool, of what's civilized, is all seen through the lens of white culture. Eating with your hands, which many cultures across the world do: uncivilized, unhygienic. Paris fashions and "made in Italy": marks of excellence. "Made in India" and "made in China": cheap, low quality, scammers and shit people. (That's not to say shit people and scammers don't exist in India and China, or that there aren't industries of scammers across these poorer nations, but there is also no denying the exquisite craftsmanship across fashion and beauty in both countries.)
Brown skin bad, darker skin even worse, etc. We wear the clothes our colonial masters did, across the world; you won't find people in China wearing Chinese clothes, same for India, African countries, etc. Everything from clothing to beauty to furniture to houses, everything indigenous was changed to a foreign ideal from our rulers. The same would have happened whichever ethnicity had invaded the world, like I said above.
Now (and thanks for reading, whoever read till here; I hope a lot of people did) on to the show...
There is this frequent refrain and accusation of "oppression olympics" that I read about when it comes to this show, in general online discourse, and also in that insanely disgusting article fat-shaming Nicola Coughlan in The Spectator.
And I really, really want to open up this conversation, because race gets discussed a lot on this sub and other Bridgerton subs, so it's the perfect place to have it: do non-POC really believe in the concept of oppression olympics, and that all kinds of biases are faced equally?
So, I was overweight in my teens, and I had a friend in a wheelchair. My life was hard and I was bullied, but surely, surely I do not think my plight was the same as my friend's in a wheelchair. In class 10, a close friend's father passed away; the same year, my grandmother passed away. Surely, surely I don't think we experienced the same pain, the same setback to our lives. Class 10 is when, in India, we have something akin to A levels in Britain or the SAT in the USA.
You know why I didn't think any of this, and why, if I had, I would have been wrong? Because there is a hierarchy to oppression, to loss, to struggles. There simply is. Pain is pain, but privilege is a thing, and some loads are just lighter compared to others. I could lose weight and get past the oppression I was facing because of my weight in school; my wheelchair-bound friend didn't have that option. I wouldn't get my grandmother back, but losing a father would have a more immediate effect on my friend's life than losing a grandmother had on mine.
So when the show's fandom compares Jonathan Bailey's struggle to Regé-Jean Page's, or say Victor Ali's, they are simply pointing out that yes, while Jonathan Bailey is gay, there is a difference between being white and gay and being Black and gay, or simply being Black, which is harder. And therefore there is a hierarchy. This is also why POC are loath to criticize the average acting of their POC faves: they know the opportunities for them are limited. Let's do an exercise: name 5 shows led by a POC in the USA or Britain or Canada; any POC, brown, Black, I don't care. And I'm not talking about Black or brown actors in a show or film, I'm talking leads.
Also, biases are self-perpetuating. When Black- or brown-led films fail, the opportunities immediately dry up, because supposedly it's just fairness and "the numbers don't lie." Except this is not maths. Maths would be first making an equal number of shows starring Black or brown people, and then comparing. If the end result were that white-led shows do better, that would be correct math; but ratios, comparisons and statistics need to start from an even scale. That's not even me talking about diversity; that's just math.
Let's say there are 500 white actors in Hollywood and 200 Black actors, and let's say all 200 Black actors are shit. Would you say, "God, white actors are so much better than Black actors"? No, because the maths doesn't add up: you compared 500 white actors, of which 300 were excellent and 200 were bad, and turned that into the conclusion that white actors are better, even though you were only comparing against 200 Black actors to begin with. The actual numbers yielded an equal count of bad actors on each side, just drawn from wildly different sample sizes.
Why do people defend Kanthony harder, or want Simone Ashley to be promoted as much as Polin? Because (to go with my math analogy again, all things being equal) unless Nicola and Luke were really bad actors compared to Simone Ashley, they'll still have it easier in acting. They simply will. They won't have to change a confusingly long, foreign-sounding name; they won't have to work at assimilation in other ways; if they were Muslim, they wouldn't have to try so hard to sell themselves as "I'm just as liberal as you, I'm not a threat." And I love Luke Newton and Nicola Coughlan; both seem like thoroughly likable, personable, kind-hearted beans.
Now, my own experience. I have stated in an earlier post on this sub that I'm a Shonda Rhimes fan, because I have personally benefited from the diversity she has included so naturally, so elegantly, in all her shows. My own ideas of beauty have been tested and transformed. In fact, I know exactly the moment it happened: watching Cristina Yang as a 14-year-old and thinking, god, Meredith is so beautiful and I want to see more of HER and Izzie, but then seeing Cristina again and again and again, over each episode, over hundreds of hours of binging, through school, college, my master's, my big move outside India, with this show as my own constant. My brain was soft, mushy and impressionable, and Cristina Yang, Korean Sandra Oh, suddenly (but actually slowly and then all at once) became beautiful to me. I don't know when it happened, but like book Colin telling Penelope in the carriage scene: I don't know when or how, and why others don't see it, but you ARE beautiful. I found her hot, I found her cool. I cared to learn more about her Korean mother and her Jewish father, and it didn't matter that the show never covered that culture; Cristina was an atheist and could handle her shit, even around racists. But I still wanted to learn more about HER, a woman I simply hadn't thought was beautiful compared to Meredith and Izzie. Me, an impressionable 14-year-old, made that turn because of a show. A fluffy, not-that-deep, soapy-as-hell show.
But it was only possible because I watched those 24 episodes a season, day in, day out, over many, many years. I had many more years of falling asleep to Dawson's Creek (Joey and Dawson, Joey and Pacey) and Gilmore Girls (Rory and Jess, Rory and Dean, Rory and Logan); I imagined and dreamt of a windswept Mr. Darcy; and of course I luckily had my own country's pop culture, because I grew up and lived in a country where I was repped plenty, everywhere. So I found Bollywood heroes hot, and I found white Americans or British people hot; my brain simply didn't have a framework for East Asians being hot. And I needed to see them constantly and frequently, hot, front and center, desirable, covetable, their culture and their families respected and admired, again and again and again, over many years, for that switch to happen.
So why do we fight for our POC faves, fight hard (some fight really ugly too, which... they are not my people, as in I don't know them, so sorry for that)? Because we know how much more repping they need. We know how much MORE MORE MORE important it is to see POC culture done right, because it simply is not accessible, and cannot be, to people outside of it.
K-pop is making waves, sure, and people love themselves some manga and kimchi: good, that's progress. But a lot of countries' pop culture is in the native tongue, and people outside of it cannot access it the way they can English-language content: books, podcasts, news, films, shows, etc. And Hollywood does our culture wrong: played up for laughs, stinky curry, stinky Indians, scamming Indians, uncivilized heathens, oh-so-funny Sofía Vergara and her funny accent (The Ellen Show), niche shows that don't become popular behemoths because they star all-ethnic casts (Fresh Off the Boat).
You simply will not understand how brilliant and gorgeous my country's embroideries, temple art, clothing and fashion, actors and sculptors, museums, writing and authors are. Not because YOU are racist, but because you don't speak my language, and my culture isn't dominant or widespread enough (insert, again, my explanation above of colonialism and dominant and suppressed cultures). History made one culture the most widespread; again, not because white people are bad and racist and brown and Black people are good and not racist and benevolent, but because they simply were the victors, for centuries, plural.
We want to fight harder for Regé and defend him leaving the show (I'm not even Black, but when you live outside of your home country, all POC feel like one united underrepresented group and you find kinship everywhere), because any potential cancellation or quote-unquote "unlikeability" or "he's difficult" accusations would hurt him far, far more than they would Nicola Coughlan, Luke Thompson, Newton, Phoebe, etc. for the same crimes. And there just aren't enough of us to begin with to lose even one. For what is, quite honestly, a mid Netflix show with seeds of promise but a lack of ambition.
submitted by Potential-Lack-5185 to BridgertonNetflix [link] [comments]


2024.06.01 14:30 beardedGraffiti My valid uk visa is on my old passport which is expired now. My old passport has Father's name on it. However, my new passport has Husband's name on it. Will I face any issue due to this?

Nationality: Pakistani
Visa type: Tourist
Posting on behalf of my cousin; please let me know if you require any more information.
submitted by beardedGraffiti to ukvisa [link] [comments]


2024.06.01 14:30 Mission_Star5888 Our Happiness are Moments We Need to Always Remember

I had to go to the grocery store today. I was running out of food for the dog, short on milk, and just needed some things. I haven't been out for a month because of my stepdad.
I can't trust my stepdad; he will just sit and watch TV while I am gone. He has dementia and is 85 years old. Not too long ago he went outside to the shed. He slipped and fell. I didn't know he was outside; he never tells me. His son came by for dinner and found him while I was outside vaping. I did walk around looking for him, but just in the wrong places. Anyway, I got out today because my step-sister-in-law came by for a while to keep an eye on him.
I went to the Weis grocery store. While I was shopping I was getting my cat her canned cat food. It was taking me a while because I was looking for a different food for her that I didn't find. I looked up and this older guy was just standing there waiting for me. I told him he should have said something, and I moved out of the way. We got talking about our cats. He has fifteen-plus cats that he takes care of; I have one. But I have my cat for a reason, I believe.
First of all, I have had two cats. I got my first cat about 18 years ago. She was a black cat that my neighbors were supposedly taking care of. My neighbors back then, at least the guy, were jerks. The father laid out in the sun in his bikini bottom and didn't do anything all weekend. I felt sorry for his wife and kids. They always had cats running around outside. This black cat came to me one day and I found some food for her. She kept coming back. She became my best friend.
I went through some very hard times. I even thought of suicide. You know what kept me from doing it? My cat Midnight. It just seemed like every time she came to me I had peace. I'd go out for a cigarette and she would come to me without me calling her. She would come because she knew I needed her. A few times she was sitting right outside the door. When she passed away she was in my arms. She was like my best friend, an angel at that. I really do believe God sent her to be my friend.
Then, about a couple of weeks before she passed away, she ended up getting under the porch. We had a board off because we had to do plumbing work under there years ago, and Midnight liked going under there in the winter. We had to get her out so she didn't just die under there. When we did, we kept her inside and took care of her. A few days after, this calico cat, her name is Reese, walked up to me outside. She was rubbing my legs and meowing. She just followed me inside. I kept her in my room until Midnight passed on. Now she is all over the place and a climber. My mom passed on a year later from pancreatic cancer. A lot of other crap happened in that year, and if God hadn't brought Reese before Midnight passed on, I don't know where I would be today.
I believe everything happens for a reason and what we decide changes our future. That's why we need to make sure we stay on a good path and not a bad one. Personally, I don't think we need to try to be perfect, because that's impossible, but we should use common sense. There is always a better way, and having faith is what helps you get there. Sometimes we just need a little help, and we get a friend.
submitted by Mission_Star5888 to OpinionsMatter2Me [link] [comments]


2024.06.01 14:29 Independent_Wash_487 honestly wishing I wasn’t pregnant right now. having horrible thoughts right now.

I have so far been up all night, as I can't fall asleep for anything. I have been stressing so much lately and there is nothing that can ease the stress, on top of the thoughts of what all this stress can do to the baby long term, medical-wise. It's honestly eating away at me and I just don't know what to do with these thoughts. I am going to just write out everything that's been eating away at me, which is why this is going to be very long, as it's been a lot so far.
I got off birth control in December, as life was going amazing and it was giving me awful migraines as I was nearing my third year on Nexplanon. I knew there could be a possibility of getting pregnant, and honestly, with how life was going, the thought of potentially getting our boy, as we have two girls right now, was really exciting me. I have been working from home and recently got a huge raise, and things were looking great. A month after getting off the birth control I started feeling weird; that intuition feeling came. Shortly after, the job I had been with for almost 3 years randomly laid off a lot of employees, including me, with no notice or anything. I thought it would be a long-term job, but they eventually developed financial issues. Of course I didn't want to abort this baby just because my job laid me off. I am a very independent person, so of course I would do anything to make sure me and mine are straight.
I started back DoorDashing full time from 9 am to 9 pm, and I stay an hour from the nearest city, so the stress of wear and tear on my car has always scared me, but you gotta do what you gotta do as a parent. Of course they say I'm eligible for unemployment, and I've tried endless times to file for it, and they always denied me due to work searches, as I supposedly wasn't applying to the right places when I was applying EVERYWHERE, no matter what I put. Every week would get denied because of this, and it's impossible to reach someone on the phone number. So I've given up on unemployment. We lived off of our tax return plus DoorDashing, money which I really wanted to save.
The work search has been so stressful. I got a seasonal job and did amazing at it, working up to 18 hours of overtime one day, but they over-hired, so there were way too many people to consider hiring everyone full time, so once it ended, less than a month after I started there, that was it. Until I got a call from my dream job, where my mom and my bf's mom both work, and which pays way higher than what I've made, on top of providing a hybrid schedule too. I felt it was a stretch applying, but my resume looked really good, so I went for it. I got an interview with them, and the interviewer loved me and said he felt really confident in me and would like to offer me the position. Of course I'm overly excited, because this is my dream job. After filling out the onboarding paperwork and going to scheduled onboarding appointments, they stated there was only one issue stopping my onboarding, which was a previous account with them that had restrictions on it that I was not aware of. I trusted the wrong "friends" back in high school, 7 YEARS AGO, with my personal information, not knowing any better, and they committed fraud with my information, and of course it fell back on me. I even paid back every owed penny from the fraud to clear my name and move on from that mistake. They never told me they had also put restrictions on the account because of it. So my onboarding was put on hold until I handled the restrictions. Fast forward a month later of struggling to make ends meet.
We have no more income tax money, but at least I finally received a response saying the restrictions would be removed. So finally we get great news. I let the people know, and they proceeded to tell me that someone would reach out to me in 5-7 days, and it has now surpassed that time frame with no response, and I am just so scared that they won't follow through. It is now June, and I have been struggling to get a full-time job since February, and I am holding onto the little ounce of hope that this job will follow through like they said.
Holding onto that ounce of hope, and with DoorDash growing very stressful, I decided to pull all of my retirement out from my previous job and put it into savings in case we need it for an emergency, especially if the car were to go out on us, on top of our apartment lease renewal coming up. We did NOT want to renew the lease, because this apartment has treated us HORRIBLY since we moved in. We came from a clean, bug-free apartment, which we left because the rent randomly increased twice while we were there, and moved 30 minutes to be closer to my OLD job and his family. We moved into this apartment because a work friend referred it to me, saying it was her first apartment and she fell in love with it. Little did we know we were going to get probably the worst apartment building out of the whole complex. We could not look at the apartment until after the lease was signed and we were given the keys.
We moved ONE box into the place and came back days later to move the rest of our stuff. We moved that box and SO MANY roaches scattered from it, and we knew instantly we got played and that all of our things were going to be roach-infested now. Fast forward almost a whole year later: we have tried endless methods to get rid of them, such as boric acid, Orthene, endless traps, endless Raid bottles, ON TOP of the apartment building's monthly pest control coming in and doing whatever they do, and WE STILL HAVE THEM. No matter what we do, they are not leaving, and I believe that even though we try different methods, it won't matter if everyone else in the building isn't trying to get rid of them. They are probably being constantly rotated between the other apartments around us, so it's useless. I DO NOT WANT TO BRING MY NEWBORN BABY INTO THIS APARTMENT.
On top of that, all the plugs in the living room walls have blown. I let the landlord know about this, and they sent their only technician to check it out, and he didn't know what he was doing. He switched the power surge switches on and off, and it fixed the plugs, but they proceeded to go back out the next day. We haven't even been there a year, and the refrigerator has been tearing up like crazy. The whole bottom of it has ripped off, because apparently the adhesive is so strong that closing and opening the door slowly tore its own frame off. We had to use Gorilla Glue to glue it back on, and so far it has worked. On top of that, the rims around the door got so many rips in them. The door shelves on the fridge can't hold anything heavy or else the whole shelf falls off, same as the door handles on the freezer, so we have to carefully move things around and put only certain things in those spots or else it's all going to the floor. The first red flag of the apartment is that there are no washer and dryer hookups, and that is honestly the least of our problems with this place. The bolts on the dishwasher are so tiny and came unscrewed over time, and randomly the whole dishwasher completely fell down from being poorly connected to the counter.
Whenever you open it to put dishes in, you have to hold the racks or else the whole dishwasher will fall forward and they will roll out with all the dishes in them. I'm so over this place, and we have BEEN ready to move out. So once the 30 days came up, I contacted the landlord about giving 30 days' notice that we were going to move out before the lease renewed. She proceeded to tell me, with no emotion, that they required a 60 DAY notice in advance of our lease end date and that our lease had already renewed for another YEAR. She didn't even try to help us out, and they had never provided any kind of notice or reminder about the 60-day requirement. She just kept repeating that we signed the lease and it is written in the lease. She said if we move out, we will be responsible for paying the rent for each upcoming month until a new tenant moves in and takes over the lease, which is very unlikely, as they have full control over whether they want to move someone new in or continue to bill us the monthly rent.
So now we are trying to figure out how we are going to work around this new obstacle/roadblock, and I am already halfway through my pregnancy. All of this stress has been eating me alive for the whole beginning of my pregnancy, and it ALL came out of the blue. This is not how I pictured being pregnant with my third baby, and I feel completely miserable right now. I am struggling, DoorDashing all day, just waiting for any kind of good news. We are thinking about ditching this apartment and going to stay with my mom until I find a full-time job to afford a new apartment, as we just want to start the process of this landlord potentially searching for a new tenant to take this burden off of us. We do not wish to pay rent on two apartments; my credit is amazing, and I know that if they send any of this apartment's owed rent to collections when we move out, it will ruin my credit. We have been building it for when we are ready to get our first home, so I know not paying is not an option for me.
I've just been wishing this ongoing nightmare would finally come to an end and I could finally receive any kind of good news. I was sooo excited about this pregnancy, but now I have zero excitement for it, as I have been through endless stress and roadblocks the whole pregnancy so far. I am afraid that this stress and pain will affect the baby long term. I do not wish to bring this baby into this apartment, as I fully wish to be in a new, upgraded, bug-free apartment when the baby comes in October, my birthday month. It's just super hard holding onto any kind of motivation right now, and the pain is slowly eating me up inside. I just really needed to get all of this off of my chest, and hopefully maybe I can finally get some sleep right now. If you read all of this, thank you for listening, and all I can really say is check on your people, because you never know what they could be going through, as life can hit so randomly at times.
submitted by Independent_Wash_487 to pregnant [link] [comments]


2024.06.01 14:27 CinderpeltLove Professor’s weird yes/no questions during class

I am in grad school. My program is tiny so everyone has every professor multiple times for multiple classes. I am taking my last class with this professor. The course is an advanced psychology course.
I am a quiet straight-A student. I barely talk in this professor’s class cuz their classes are late at night and I am exhausted by then.
In the past, this professor didn't call on me much, but this semester they are starting to call on me more often to ask these weird yes/no questions that catch me off guard. My slow-processing brain has no idea how to generate a fast response, so I end up doing that small smile and shrug I do whenever I am uncomfortable and don't know what to say. However, this type of stuff is happening enough that I am starting to prepare for it.
For example, while lecturing about how different ppl define the word "trauma," out of nowhere they called my name and asked if it would be traumatic for me if I failed this class.
(I get all As with this prof so I am not worried but my initial thought in that moment was “wtf…no…why are you asking this?”).
Another example: they were lecturing about how severe childhood trauma can result in poor social skills and "off" facial expressions in adults (it wouldn't shock me if this professor thinks trauma causes autism, ADHD, etc.). Again, they called on me out of nowhere and asked if I would be friends with someone with a weird smile.
I don’t think my professor is doing these awkward yes/no questions with other ppl. I have been diagnosed with ADHD but not sure about autism yet. No one at my grad school knows about my ADHD diagnosis.
What do you make of my professor’s behavior? Based on the details mentioned, how would you interpret it?
submitted by CinderpeltLove to aspergirls [link] [comments]


2024.06.01 14:27 EmmyEtc Progress 7 weeks after 1st session + how long should I wait?

It’s been 7 weeks since my first session, and I’m happy about my results so far! My tattoo is 9 years old. Treatment done with Q-switched Nd:YAG.
Before I started the process I was determined to take a break over the summer because I was worried about healing during hot and sunny weather. But my healing after the first treatment was easier than expected, so now I’m wondering if I should rethink my decision. I know waiting longer is beneficial, but is that also true after the first session, or is it more important to wait longer later in the process when more of the ink is gone?
I will be waiting for at least 12 weeks until my second session (which would be early July) as that is my tech’s policy. But do y’all think I should go back at the 12 week mark, or wait until after summer?
submitted by EmmyEtc to TattooRemoval [link] [comments]


2024.06.01 14:26 prespy4400 Example of gpa calculator or score board output based on input...

public class Program
{
    public static void Main(string[] args)
    {
        string studentName = "Sophia";

        string course1Name = "English 101";
        string course2Name = "Algebra 101";
        string course3Name = "Biology 101";
        string course4Name = "Computer Science I";
        string course5Name = "Psychology101";

        int course1Credit = 3;
        int course2Credit = 3;
        int course3Credit = 4;
        int course4Credit = 4;
        int course5Credit = 3;

        int gradeA = 4;
        int gradeB = 3;
        int gradeC = 2;

        int course1Grade = gradeA;
        int course2Grade = gradeB;
        int course3Grade = gradeA;
        int course4Grade = gradeB;
        int course5Grade = gradeC;

        int totalCreditHours = 0;
        totalCreditHours += course1Credit;
        totalCreditHours += course2Credit;
        totalCreditHours += course3Credit;
        totalCreditHours += course4Credit;
        totalCreditHours += course5Credit;

        int totalGradePoints = 0;
        totalGradePoints += course1Credit * course1Grade;
        totalGradePoints += course2Credit * course2Grade;
        totalGradePoints += course3Credit * course3Grade;
        totalGradePoints += course4Credit * course4Grade;
        totalGradePoints += course5Credit * course5Grade;

        float gradePointAverage = (float) totalGradePoints / totalCreditHours;

        int leadingDigit = (int) gradePointAverage;
        int firstDigit = (int) (gradePointAverage * 10) % 10;
        int secondDigit = (int) (gradePointAverage * 100) % 10;

        Console.WriteLine(@$"
Student Name: {studentName}

Courses               Credit  Grade
{course1Name}         {course1Credit}       {course1Grade}
{course2Name}         {course2Credit}       {course2Grade}
{course3Name}         {course3Credit}       {course3Grade}
{course4Name}         {course4Credit}       {course4Grade}
{course5Name}         {course5Credit}       {course5Grade}

Final GPA: {leadingDigit}.{firstDigit}{secondDigit}
");
    }
}
https://preview.redd.it/ox5ml8ybey3d1.jpg?width=2977&format=pjpg&auto=webp&s=b834df67693b6abb371ecfeb4ce06b246f2399bc
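For reference, here is a worked check of the arithmetic the program performs (not from the original post, just the hardcoded values traced through by hand): total credit hours are 3 + 3 + 4 + 4 + 3 = 17, and total grade points are 3×4 + 3×3 + 4×4 + 4×3 + 3×2 = 55, so the GPA is 55 / 17 ≈ 3.235. The integer casts then pull out a leading digit of 3, a first decimal digit of 2, and a second decimal digit of 3, so the report should end with "Final GPA: 3.23".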
submitted by prespy4400 to csharp [link] [comments]


2024.06.01 14:26 Radoslawy This may be the best food i ever cooked

Gnocchi with beans, dried tomatoes and lamb's lettuce (roszponka).
submitted by Radoslawy to TheRatEmpire [link] [comments]


2024.06.01 14:25 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my bachelor thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue where my model only predicts 2 classes out of the 475 possible classes. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes.
This is my training script:
import os
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications import EfficientNetB7
from tensorflow.keras import layers, models
from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
import tensorflow_addons as tfa
import logging
import json

# Setup logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# Check if GPUs are available
gpus = tf.config.experimental.list_physical_devices('GPU')
if gpus:
    try:
        for gpu in gpus:
            tf.config.experimental.set_memory_growth(gpu, True)
        tf.config.set_visible_devices(gpus, 'GPU')
        logging.info(f"Using {len(gpus)} GPUs.")
    except RuntimeError as e:
        logging.error(e)
else:
    logging.error("No GPUs found. Check your device configuration.")

# Data directory
data_dir = "/app/FOOD475/"

# Image dimensions and batch size
img_height, img_width = 600, 600
batch_size = 64

# Data preprocessing and augmentation
train_datagen = ImageDataGenerator(
    rescale=1./255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest',
    validation_split=0.25
)

# Load and preprocess images
train_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='training'
)

validation_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='validation'
)

# Model creation function
def create_model(input_shape, num_classes):
    base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
    base_model.trainable = True
    inputs = layers.Input(shape=input_shape)
    x = base_model(inputs, training=True)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(num_classes, activation='softmax')(x)
    model = models.Model(inputs, outputs)
    return model

def find_latest_saved_model(checkpoint_dir):
    logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
    if not os.path.exists(checkpoint_dir):
        logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
        return None, 0
    subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir) if os.path.isdir(os.path.join(checkpoint_dir, d))]
    if not subdirs:
        logging.info("No subdirectories found for checkpoints.")
        return None, 0
    latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
    latest_epoch = int(os.path.basename(latest_subdir))
    logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
    if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
        return latest_subdir, latest_epoch
    else:
        logging.info("No saved_model.pb found in the latest directory.")
        return None, 0

# Mirrored strategy for multi-GPU training
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    saved_model_dir = 'model_training'
    checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
    latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
    if latest_saved_model:
        logging.info(f"Loading model from {latest_saved_model}")
        model = tf.keras.models.load_model(latest_saved_model)
    else:
        logging.info("No saved model found. Creating a new model.")
        model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

    if not os.path.exists(saved_model_dir):
        os.makedirs(saved_model_dir)

    summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
    with open(summary_path, 'w') as f:
        model.summary(print_fn=lambda x: f.write(x + '\n'))
    logging.info(f"Model summary saved to {summary_path}")

    optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
    model.compile(optimizer=optimizer,
                  loss='categorical_crossentropy',
                  metrics=['accuracy',
                           tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                           tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')])

# Custom Callback for Saving the Best Model in SavedModel format
class SaveBestModelTF(tf.keras.callbacks.Callback):
    def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
        super(SaveBestModelTF, self).__init__()
        self.monitor = monitor
        self.saved_model_dir = saved_model_dir

    def on_epoch_end(self, epoch, logs=None):
        current = logs.get(self.monitor)
        if current is None:
            logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
            return
        logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
        epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
        if not os.path.exists(epoch_path):
            os.makedirs(epoch_path)
        self.model.save(epoch_path, save_format='tf')

# Callbacks for monitoring progress
tensorboard_cb = TensorBoard(log_dir='./logs')

# Save class indices to a JSON file
class_indices_path = 'model_training/class_indices.json'
if not os.path.exists(os.path.dirname(class_indices_path)):
    os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
with open(class_indices_path, 'w') as file:
    json.dump(train_generator.class_indices, file)
logging.info(f"Class indices saved to {class_indices_path}")

# Model training
total_epochs = 7
model.fit(
    train_generator,
    initial_epoch=latest_epoch,  # Start from the next epoch
    epochs=total_epochs,
    validation_data=validation_generator,
    callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
)

# Evaluate the model
eval_result = model.evaluate(validation_generator)
logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

# Save the final model as a SavedModel format (including .pb files)
model.save('model_training/finished_model')
logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

# Convert to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
tflite_model = converter.convert()
tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
if not os.path.exists(os.path.dirname(tflite_path)):
    os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
with open(tflite_path, 'wb') as f:
    f.write(tflite_model)
logging.info(f"Model converted and saved as {tflite_path}")
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
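The validation numbers above come from evaluation inside the same training run. A quick way to narrow down where things go wrong is to re-evaluate the reloaded SavedModel on the same validation split: if it still scores around 0.80, the checkpoint on disk is fine and the problem most likely sits in the inference-side preprocessing or class-index mapping; if it scores near chance, the saved artifact itself is suspect. This sketch is not from the original post; the paths, split fraction, image size and batch size are assumptions copied from the training script, and the augmentation transforms are dropped.

import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Reload the exported model; compile=False sidesteps the tensorflow_addons
# F1Score custom object, and plain accuracy is attached instead.
model = tf.keras.models.load_model("model_training/finished_model", compile=False)
model.compile(loss="categorical_crossentropy", metrics=["accuracy"])

# Rebuild the validation split with the same directory, split, target size and
# batch size as in training, but without augmentation.
datagen = ImageDataGenerator(rescale=1./255, validation_split=0.25)
val_gen = datagen.flow_from_directory(
    "/app/FOOD475/",
    target_size=(600, 600),
    batch_size=64,
    class_mode="categorical",
    subset="validation",
    shuffle=False,
)

# A few dozen batches are enough for a sanity check.
loss, acc = model.evaluate(val_gen, steps=50)
print(f"Reloaded model: loss={loss:.3f}, accuracy={acc:.3f}")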
And when I try to load the model and make a prediction with this code:
class own:
    def __init__(self):
        if not os.path.exists("models/own"):
            raise FileNotFoundError(f"Model path models/own does not exist")
        try:
            self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
        except Exception as e:
            print(f"Error loading model: {e}")
            raise
        if not os.path.exists("models/own/class_indices.json"):
            raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
        with open("models/own/class_indices.json", 'r') as file:
            self.class_indices = json.load(file)
        self.index_to_class = {v: k for k, v in self.class_indices.items()}

    def classify(self, img_path):
        if not os.path.exists(img_path):
            raise FileNotFoundError(f"Image path {img_path} does not exist")
        # Load and preprocess the image
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
        img_array = tf.keras.preprocessing.image.img_to_array(img)
        img_array = np.expand_dims(img_array, axis=0)
        img_array /= 255.0
        # Make prediction
        predictions = self.model.predict(img_array)
        print("Raw predictions:", predictions)
        top_index = np.argmax(predictions[0])
        top_class = self.index_to_class[top_index]
        print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
        top_n = 5
        top_indices = np.argsort(predictions[0])[-top_n:][::-1]
        for idx in top_indices:
            print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
        return top_class
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 
7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 1.54079917e-05 
9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
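Worth noting from the raw output above: the top probability is only about 0.04, so the softmax mass is spread thinly across many classes rather than concentrated on two. A small sketch (assuming `predictions` is the (1, 475) array printed above) to quantify how flat that distribution is:

import numpy as np

# `predictions` is assumed to be the (1, 475) softmax output printed above.
probs = predictions[0]

print("sum of probabilities:", probs.sum())             # should be ~1.0 for a softmax output
print("max probability:", probs.max())                  # ~0.04 in the printout above
print("classes above 1%:", int((probs > 0.01).sum()))   # rough measure of how flat the distribution is

# Top-5 indices, highest first
top5 = np.argsort(probs)[-5:][::-1]
print("top-5 indices:", top5, "probs:", probs[top5])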
Help would be appreciated because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to computervision [link] [comments]


2024.06.01 14:23 Adept_Pea3975 lump on staffy’s back?

lump on staffy’s back?
Hi all, I recently moved into my dad's house and have noticed that his pet Staffy has a skin lump on his back. My dad says it's been there for years, and since I moved in it hasn't grown or shrunk. It's solid and black with fur growing over it and doesn't seem to cause him any pain; I can touch it and he doesn't react. The dog is 7 years old and quite happy and healthy. I walk him every few days and he shows no signs of any other health issues. Does anyone know what it could be, or has anyone had a similar issue with their Staffy? I've encouraged my dad to take him to the vet, but because it doesn't bother the dog, he sees no reason to. I would take him myself, but he is not registered in my name.
submitted by Adept_Pea3975 to StaffordBullTerriers [link] [comments]


2024.06.01 14:23 NTP9766 Using PS to migrate VDAs from one Manual catalog to a new Manual catalog

We're getting ready to migrate from ESX to AHV, and one of the steps is to migrate existing VDAs from their existing catalogs to new ones that use AHV hypervisor connections. Doing it manually in Citrix Cloud works fine, but obviously, that's not feasible for 6500 VDAs. I've been trying to do this via PS, and am getting stuck. In a nutshell, this is what the script is doing:
$VDAs = Get-Content .\VDAs.txt
$SourceCatalogName = "OldCatalog"

# Export the machines to migrate, then re-import them from the XML file
$InputFile = ".\Migration.xml"
Get-BrokerMachine -CatalogName $SourceCatalogName -MaxRecordCount 100000 |
    Where-Object { $_.HostedMachineName -in $VDAs } |
    Export-Clixml $InputFile
$VMs = Import-Clixml -Path $InputFile

$HostingConnectionDetail = Get-BrokerHypervisorConnection | Where-Object { $_.Name -eq "NewHostingConnection" }
$Catalog = Get-BrokerCatalog -Name "NewCatalog"

foreach ($VM in $VMs) {
    #Turn on Maintenance Mode
    Set-BrokerMachine -MachineName $VM.MachineName -InMaintenanceMode $true

    #Remove from the existing Delivery Group
    Remove-BrokerMachine -MachineName $VM.MachineName -DesktopGroup $VM.DesktopGroupName

    #Remove from the existing Machine Catalog
    Remove-BrokerMachine -MachineName $VM.MachineName

    #Add to the new Machine Catalog
    New-BrokerMachine -CatalogUid $Catalog.Uid -HostedMachineId $VM.HostedMachineId -HypervisorConnectionUid $HostingConnectionDetail.Uid -MachineName $VM.SID

    #Add to the Delivery Group (same as previous one)
    #$DeliveryGroupName is assumed to be set elsewhere in the full script
    Add-BrokerMachine -MachineName $VM.MachineName -DesktopGroup $DeliveryGroupName
}
The script completes successfully, and I see the VDA in Citrix Cloud in the correct MC and DG. However, I cannot query for this VDA using Get-BrokerMachine, and the Power State shows as Unknown. I have to be missing something obvious, and I've stared at this long enough that I need another set of eyes. Any chance somebody sees where I'm going wrong here?
submitted by NTP9766 to Citrix [link] [comments]


2024.06.01 14:22 Rupieeroo IT sector in Norway questions

Hi there,
My family and I are considering a move to Norway (I have some family there already). I am English and currently employed as a DevOps Engineer. We're trying to work out if such a move is viable for us. I have a 2-3 year old child and my wife is an EU citizen. I have a few questions I was hoping someone could answer about the IT work situation in Norway:
1.) Are there many IT jobs? I can see a number on LinkedIn but am not sure where else I should be looking. Also, is there much in the way of green tech? That would be my preference!
2.) Is there much in the way of English-language jobs? Obviously I'd like to learn the language if we live there, but I'll need to start somewhere!
3.) What is the rough salary estimate for a mid to senior DevOps/software engineer? I need to know I'll be able to support my family with a good quality of life if we move.
4.) Is it tough to get a 2-3 year old into the childcare/school system? That is a potential issue where we currently are.
Any additional information would be greatly appreciated! Thanks!
submitted by Rupieeroo to Norway [link] [comments]


2024.06.01 14:21 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor Thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue where my model only predicts 2 classes out of the 475 possible classes. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes.
This is my training script:
import os
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications import EfficientNetB7
from tensorflow.keras import layers, models
from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
import tensorflow_addons as tfa
import logging
import json

# Setup logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# Check if GPUs are available
gpus = tf.config.experimental.list_physical_devices('GPU')
if gpus:
    try:
        for gpu in gpus:
            tf.config.experimental.set_memory_growth(gpu, True)
        tf.config.set_visible_devices(gpus, 'GPU')
        logging.info(f"Using {len(gpus)} GPUs.")
    except RuntimeError as e:
        logging.error(e)
else:
    logging.error("No GPUs found. Check your device configuration.")

# Data directory
data_dir = "/app/FOOD475/"

# Image dimensions and batch size
img_height, img_width = 600, 600
batch_size = 64

# Data preprocessing and augmentation
train_datagen = ImageDataGenerator(
    rescale=1./255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest',
    validation_split=0.25
)

# Load and preprocess images
train_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='training'
)

validation_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='validation'
)

# Model creation function
def create_model(input_shape, num_classes):
    base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
    base_model.trainable = True
    inputs = layers.Input(shape=input_shape)
    x = base_model(inputs, training=True)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(num_classes, activation='softmax')(x)
    model = models.Model(inputs, outputs)
    return model

def find_latest_saved_model(checkpoint_dir):
    logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
    if not os.path.exists(checkpoint_dir):
        logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
        return None, 0
    subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir) if os.path.isdir(os.path.join(checkpoint_dir, d))]
    if not subdirs:
        logging.info("No subdirectories found for checkpoints.")
        return None, 0
    latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
    latest_epoch = int(os.path.basename(latest_subdir))
    logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
    if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
        return latest_subdir, latest_epoch
    else:
        logging.info("No saved_model.pb found in the latest directory.")
        return None, 0

# Mirrored strategy for multi-GPU training
strategy = tf.distribute.MirroredStrategy()

with strategy.scope():
    saved_model_dir = 'model_training'
    checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
    latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
    if latest_saved_model:
        logging.info(f"Loading model from {latest_saved_model}")
        model = tf.keras.models.load_model(latest_saved_model)
    else:
        logging.info("No saved model found. Creating a new model.")
        model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

    if not os.path.exists(saved_model_dir):
        os.makedirs(saved_model_dir)

    summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
    with open(summary_path, 'w') as f:
        model.summary(print_fn=lambda x: f.write(x + '\n'))
    logging.info(f"Model summary saved to {summary_path}")

    optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
    model.compile(optimizer=optimizer,
                  loss='categorical_crossentropy',
                  metrics=['accuracy',
                           tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                           tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')])

# Custom Callback for Saving the Best Model in SavedModel format
class SaveBestModelTF(tf.keras.callbacks.Callback):
    def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
        super(SaveBestModelTF, self).__init__()
        self.monitor = monitor
        self.saved_model_dir = saved_model_dir

    def on_epoch_end(self, epoch, logs=None):
        current = logs.get(self.monitor)
        if current is None:
            logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
            return
        logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
        epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
        if not os.path.exists(epoch_path):
            os.makedirs(epoch_path)
        self.model.save(epoch_path, save_format='tf')

# Callbacks for monitoring progress
tensorboard_cb = TensorBoard(log_dir='./logs')

# Save class indices to a JSON file
class_indices_path = 'model_training/class_indices.json'
if not os.path.exists(os.path.dirname(class_indices_path)):
    os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
with open(class_indices_path, 'w') as file:
    json.dump(train_generator.class_indices, file)
logging.info(f"Class indices saved to {class_indices_path}")

# Model training
total_epochs = 7
model.fit(
    train_generator,
    initial_epoch=latest_epoch,  # Start from the next epoch
    epochs=total_epochs,
    validation_data=validation_generator,
    callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
)

# Evaluate the model
eval_result = model.evaluate(validation_generator)
logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

# Save the final model as a SavedModel format (including .pb files)
model.save('model_training/finished_model')
logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

# Convert to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
tflite_model = converter.convert()
tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
if not os.path.exists(os.path.dirname(tflite_path)):
    os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
with open(tflite_path, 'wb') as f:
    f.write(tflite_model)
logging.info(f"Model converted and saved as {tflite_path}")
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
And when I try to load the model and make a prediction with this code:
class own:
    def __init__(self):
        if not os.path.exists("models/own"):
            raise FileNotFoundError(f"Model path models/own does not exist")
        try:
            self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
        except Exception as e:
            print(f"Error loading model: {e}")
            raise

        if not os.path.exists("models/own/class_indices.json"):
            raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
        with open("models/own/class_indices.json", 'r') as file:
            self.class_indices = json.load(file)
        self.index_to_class = {v: k for k, v in self.class_indices.items()}

    def classify(self, img_path):
        if not os.path.exists(img_path):
            raise FileNotFoundError(f"Image path {img_path} does not exist")

        # Load and preprocess the image
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
        img_array = tf.keras.preprocessing.image.img_to_array(img)
        img_array = np.expand_dims(img_array, axis=0)
        img_array /= 255.0

        # Make prediction
        predictions = self.model.predict(img_array)
        print("Raw predictions:", predictions)

        top_index = np.argmax(predictions[0])
        top_class = self.index_to_class[top_index]
        print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")

        top_n = 5
        top_indices = np.argsort(predictions[0])[-top_n:][::-1]
        for idx in top_indices:
            print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")

        return top_class
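For context, a minimal sanity check I could run alongside this (a sketch, assuming the 'model_training/finished_model' SavedModel and the validation_generator from the training script above are still available) would be to reload the exported model and evaluate it on the same validation generator, so that any large gap from the ~0.80 validation accuracy seen during training would point at loading or preprocessing rather than at the weights:

# Minimal sanity-check sketch (assumes 'model_training/finished_model' exists and
# validation_generator from the training script above is still in scope).
import tensorflow as tf

reloaded = tf.keras.models.load_model('model_training/finished_model', compile=False)
reloaded.compile(optimizer='adam',
                 loss='categorical_crossentropy',
                 metrics=['accuracy'])

# If loading and preprocessing match the training pipeline, this should land
# near the ~0.80 validation accuracy reported at the end of training.
loss, acc = reloaded.evaluate(validation_generator)
print(f'Reloaded model - val loss: {loss:.4f}, val accuracy: {acc:.4f}')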
it always predicts either Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: https://github.com/tensorflow/addons/issues/2807
warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: https://github.com/tensorflow/addons
warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 
7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 1.54079917e-05 
9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
Help would be appreciated because I'm slowly losing my mind :(,
Jonas
submitted by Jonasbru3m to tensorflow [link] [comments]


2024.06.01 14:20 Polypedatess Is this even bad enough to have PTSD from?

I'm just so tired all the time, it literally feels like I can sleep all day. I have a normal sleep schedule, but everyday I just feel so exhausted. I have dark circles under my eyes and I have no energy to do anything anymore. I just lay in bed all day and want to rot. I feel suicidal, I just want to die all the time and it's getting worse. I get nightmares of him, not of what exactly happened but just of different sa from him. I feel like there's no point in going on anymore, I don't think it's going to get better. I don't exactly know what it's like to have a flashback, but I think I've experienced them. I have really bad maladaptive daydreaming, but I don't think it's that. It's like I'm there again, I can't control it or stop it or rewind it. It's like it's happening all over again and that I'm there and I can feel it. When it's happening I just sit there and cry and I feel like screaming but I obviously can't do that so I have to hold it in. My head feels like it's burning constantly too, like the back of my head feels so fucking warm and hot. Like my brain is melting. And I just want to die and I'm so tired I just want to sleep and never wake up again.
•The one big thing that makes me feel valid is that, when I was 11, my stepdad fingered me in my bedroom. I won't go into too much detail or anything, it's unimportant. But the entire time he just stared at me and everything was silent, like he was waiting for my reaction. Our relationship has always been odd, so I wanted it. But eventually I got scared and told him something, I don't remember what it was but it got him to stop immediately and he apologised too. I don't remember much after, as in I don't know if he left my room or I left first, but I immediately went to the bathroom. Which was when I discovered I was bleeding.
•Around this time, for some strange reason I would repeatedly say to him "fuck me daddy." This would either be in person, or over messages. I remember once, when I was in school, I messaged him that. He told me to stop in case one of my friends saw. I don't know why he didn't tell me to stop for other reasons.
•One day, after telling him that in person, we were in my parents' bedroom. I was sat on his bed and he was in front of me in his weird chair. He then started going into detail about how I wanted him to fuck me, I can't remember exactly what he said, it was like I zoned out. Every time I try to recall it now it literally feels like bugs start to crawl up me, I don't understand why. I remember the last part, and his really disgusting hushed and gentle voice. He asked if I wanted him to "cum inside of me", or he was just explaining how that would finish. I'm not really sure.
•Still around this same time period of me being 11-12, I would ask him to 'squish me.' The reason why we would call it that is because I would be on my back, my legs would be up all the way to where my head is and he would be on top of me in a way that would 'squish me'. Basically like that one sex position. I would usually be wearing my school uniform when that would happen, so a skirt. During the 'squishing', he would push down on me, so our crotches would basically be against eachother. I don't know why, but I would continuously ask him to 'squish me' and during it I would even say the whole "fuck me daddy" thing. Only recently have I realised that he was probably just pretending to fuck me.
•Other things had happened around that age too, like how we would talk about how many times we masturbated a day and compare it to eachother. Sometimes if I was abruptly going to my room, he would ask if I was going to go masturbate, since we were 'close like that' I would tell him. He would often recommend me NSFW Instagram model accounts. I was once tricked in to sending feet pics to this guy, which really isn't that serious and whenever I brought it up with friends they find it fucking hilarious. But the detail I always leave out is that, I did bring that up with my stepdad and he proceeded to tell me that he already knew. Which means he was spying on me through the crack of the door. If that already didn't bother me, I don't understand why he just allowed me to send those pictures, if he was watching why the hell didn't he stop me?
•I'm pretty sure this also happened around the age of 11 as well, recently, a memory resurfaced but I barely remember it. Basically, I was sucking on his neck. I don't remember who said it, but either him or my mum spoke up and laughed, saying that I needed to stop otherwise I would "give him a hickey." The reason why I wouldn't be surprised if my mum was in the room at the time is because she doesn't care about what he does. She knows everything and just doesn't fucking care.
•I'm very sure that, around that age, my parents begun to expose me to their loud sex. I wouldn't be surprised if it started even younger, however. Obviously, I tried to bring it up with them at the ripe old age of 11 and my mum immediately shot me down with a "it's natural." This only stopped recently, around this year, because I had a big panic attack over hearing them and my mum finally felt guilty. I started getting panic attacks over it the minute it started, maybe the panic attacks were a sign of the trauma when I was younger, but I'm convinced it is now. I heard it so many times that I began to get paranoid every night, I would start to hear it even if they weren't upstairs (I sound crazy, I know.) I would get so anxious every night in case I would hear it, to the point I started to really resent them from it. I know fine well I could just go to sleep before them, but sometimes they even woke me up with it, on numerous occasions.
•I'm convinced my stepdad wanted me to hear it. Around the time of it finally stopping, I got mad because i was hearing it again (I'm unsure if it was due to me hearing shit or they actually were) but it caused me to take my bedding and go downstairs to sleep. In the morning, I was rudely awoken to my stepdad slamming the door open and storming past. He's not usually like that when people are sleeping, so it instantly gave me the impression that he was pissed off and the only reason I can think of is that he was angry I wasn't there to listen.
•He used to tease me for my paranoia too. As a way to discourage them from getting intimate, I would leave my door open at night. This happened around this year, but I was doing that again and I messaged my stepdad asking if they were actually going to sleep. It then somehow turned to him making a dig about how he knew I get anxious at night and when I asked why he sent me "In case me and your mam have sex. 😜" Before, I tried to resolve this issue by begging them to just tell me if they were gonna have sex or not so I could sleep downstairs (because I was gonna find out the hard way anyways.) And they kept on refusing? Which just gave me the impression that they wanted me to listen more.
•Around 11 again, he would often tell me details about his and my mums sex life. Like how he was always good at pulling out and the only time he would wear a condom is right when he was about to finish. But the reason why my sister came to be was because he just failed to pull out that one time and my mum refused to get an abortion. Another time, he went on about how him and my mother had sex during her period and how they had to use towels and they didn't enjoy it because it was too messy.
•I don't know if he did things before the age of 11, my memories are very faded and it's like there are major gaps throughout everything. I'm worried that he did, however. When I was very young, I remember having no accidents at all during the night. But then, around the ages of 9, I would have an accident basically every night and would get a lot of water infections. I know that's a classic sign of child sexual abuse, but I don't want to jump to conclusions or anything.
•Another reason as to why I believe more things had happened to me than what I know of is because I always seemed to know what sex was when I was young, but I wouldn't know the name or anything specific about it like how to get pregnant or what cum was. Though, even though I didn't know what it was, it was like I always thought about it, I could never not think about sex, it was disgusting. This stayed until I was around 13. I remember where I even asked my 'boyfriend' at the time, we were both around 8, if he wanted to have sex, and I have no idea why.
•Over the years, he would flash me frequently. Every time, I would always believe it was an accident because he'd never acknowledge it, apart from that one time, which he always jokes about and blames me for. Every time he would flash me, it would either be because of a convenient hole in the crotch of his pants or because he was wearing very loose-fit shorts and it would just be hanging out. The more I think about it, I'm very sure he would have been able to feel such a thing, especially when it was poking out of the hole, but it was like he was just oblivious.
•For some strange reason, when I was younger, I would make comments about small dicks. I don't know if I was commenting on his dick specifically, but he would always say the same thing. "Width matters more than length."
•Recently, around 16-17, he made a joke about how he listens to me masturbating. Once he noticed how shocked I looked, he then went on saying about how my vibrator is too quiet to hear.
•Around 17 again, I went to use the shower. The shower I use is the one that's connected to my parents room. When I locked the door, he got madish and started making comments about it. I had to defend myself, saying how 'the door would open on it's own if I didn't lock it'. Eventually, he backed off.
•I don't understand the point in the fucking door and lock to my bedroom anymore. Whenever I decided to lock my door, my parents start shouting at me through the walls, asking why I locked my door. My stepdad barely knocks, it's like a tap and he doesn't even wait sometimes. I remember seeing a past message from an old friend saying how he tried to walk in when I was changing and that he knew I was changing. I didn't explain myself, I really wish I did because I don't remember this.
•(Around 17.) We were messaging each other and it somehow turned into him hinting at whether I had seen this one animated video, it was a porn one. I said no, and to that he sent me a screenshot of it. It wasn't anything bad or anything, just the start of it and nothing was revealing, he then asked if I was sure. And how he was surprised that I hadn't.
•(Around 17.) I don't really get my period, we still don't know why. But as I was getting a lot of blood tests, my stepdad was trying to check things off the list of what it could be. One of those being that my opening is just extremely tight I guess, because he asked if I ever tried penetrating myself. I admitted that I did, but I couldn't get it to exactly go in. Which he then decided to make a comment saying how It's just my 'technique'. I wonder if the only reason he asked that was to see if I ever tried anything out of morbid curiosity.
•(Around 17 again.) He randomly bought me dildo's once, I didn't ask him for them, he just bought them for me and it was wildly uncomfortable. Once he gave me them, he asked if I wanted him to show me how to use them. I said no, which he then said something about how if I ever did then I could ask him. I worry what would have happened if I did say yes.
•When I was around 14, I went glamping. I ended up having to share a bed with him. One of the nights, I woke up to his hand just on top my crotch. I tried grabbing it and moving it away but it just fell back down on to it. I don't know if he put it back there on purpose. I still question if it was a dream, I'm very sure it wasn't because I remember going back to sleep, but it still just bugs me.
•Around 17, I was upset for some reason and he was comforting me. During this, he randomly grabbed the inside of my thigh. I usually just wear a shirt and boxers, so he basically just grabbed my naked thigh but I don't know if he was doing it in a comforting way.
•Usually when I draw, I have my knees up to my chest so it's easier to use my tablet. Considering what I wear for pyjamas, I can always see him looking at my crotch when he comes in to my room. If he really can see everything I don't understand why he doesn't just tell me to put my legs down.
•He's made a lot of uncomfortable jokes over the years too. One of the ones that upsets me sometimes is that, when he was measuring me for a binder, I was constantly moving around because it was uncomfortable since I was just in a sports bra. As he was leaving, I think I told him about how it was uncomfortable for me or something along those lines. He then turned around and shouted "oh come on, it's not like i was fingerings your pussy or anything."
•Very recently, I asked him if I looked okay before going to college. After a bit of back and forth he said "I wouldn't kick you out of bed, maybe you could find someone in college who would do the same."
•Other times when I asked him if I looked okay, he'd go on tangents about how my ass is great or how he would date me or be too nervous to talk to me if he was my age.
•One of the more recent jokes was when I dropped a mayonnaise lid on my lap. Nothing got on me, but my stepdad turned to me then turned to my mum and shouted "if anyone starts accusing us, just tell them it was mayonnaise!" Or something like that.
•I remember after we watched the new Mean Girls film, he started going on about how he wanted to rewatch it for the Halloween scene (if you know you know) for the 'panty action'. Which rubs me the wrong way because I'm very sure the girls are supposed to be around my age.
•I'm very sure he also made this fake account, pretending to be one of my old groomers that I tried to cut off, just to message me about nsfw topics and ask for pics. It's a whole long yap about paranoia and just suspicions so I won't get into it though. If I tried to provide all the evidence I have, it'll take forever and there's no point.
There's definitely way more things that he's said, joked and done. But I'm only now beginning to realise that they're not okay. Even when I was younger, I was sort of uncomfortable around the jokes so I would just zone out, leading me to not remembering them now.
I probably will never accept that what happened to me was bad, or a big issue. Especially due to the 'lovely' people on here. Thank you for telling me immediately that I was a liar before you even knew what happened, that I shouldn't blame an 'innocent man', that you hope he comes in and rapes me to the point I split open and bleed. Thank you for telling me that my parents were just trying to promote a sex positive household, that some of the things were questionable at most. Thank you so much for saying I deserved it because I didn't send you pictures. You all made me feel like shit and I'm probably never going to tell people in person what happened to me, out of fear I would be ridiculed due to how much of a baby I'm being. I wasn't raped, so I have no place to cry or even think about it. I'm being overdramatic.
If you even read to this point, you're an angel.
submitted by Polypedatess to abusesurvivors [link] [comments]


2024.06.01 14:19 Environmental-Win259 Finding peace after being verbally threatened.

Hi everybody.
My apologies for the long text, and my apologies if my English isn’t correct, it is not my mother tongue.
I've recently been stepping into Buddhism and mindfulness. I am 37 and was diagnosed with autism and ADHD two years ago, and I overcame a heavy substance addiction three years ago.
In September I dated a girl for about two months. She had broken up with this guy about three months before I met her. I have known of this guy for a long time, as we live in the same city, but I was never aware that they used to date. This person is not even an acquaintance.
In contrast, I know his sister better, as we both went to mostly the same events in the city. A complete opposite, and a very thoughtful person compared to her brother.
The girl I was dating told me how invasive this guy was in her life, how verbally aggressive he could be, and how he hit a guy she was having a drink with on a terrace after they broke up.
Now.
This guy has threatened me twice: once on New Year's Eve, and yesterday. Both times he said I was lucky that we were in a public place, otherwise he would have knocked my teeth out. This was while grabbing me by the neck and holding his head close to mine, saying it in a very aggressive, hateful way.
I remained calm during this situation, as I am an advocate of non-violence. I reported this to the bar staff, as it's a bar I used to work at, and headed home.
It's a pity that this situation happened, given that I don't have a lot of social contact and just wanted to say hi to my old colleagues and maybe have a drink by myself.
Although I notice the early effects of my practice on my daily life, I am unable to work through this situation at the moment. I am worried about aggression in the future, as this could be dangerous for me as well as for him. I am aware that if someone keeps pushing my 'buttons' I can lose control over myself and become unaware of my own strength. I would never want this to happen, but in my mind I am preparing for this situation, verbally and physically.
On Monday I am going to the police to report his verbal aggression and threats towards me, as I feel this is me being one step ahead of a situation I don’t want to be in.
How do I deal with this mindfully, as it is occupying my mind right now?
As I am aware that being on the spectrum makes this situation more complex to handle emotionally, I seek advice from all of you, neurodivergent and neurotypical sisters and brothers.
I want to apologise in advance if some things I've written down are not in keeping with the Eightfold Path. Please make me aware of this, as I am willing to learn.
submitted by Environmental-Win259 to Buddhism [link] [comments]


2024.06.01 14:16 Then-Requirement6381 Multi Gen Disaster. AITA?

I’m at the end of my rope and I am hoping to hear your thoughts.
AITA for having concluded that the only way to protect my son’s psychological safety from a bloodied grandpa is to remove us from this front row seat?
submitted by Then-Requirement6381 to AgingParents [link] [comments]


2024.06.01 14:16 Seline_Kirotashi First time kitten owner here, with a few questions regarding normal kitten behaviour

So I got a kitten for my 17th birthday yesterday (her name is Mitsy and she's the cutest little thing ever!) and she's also my first ever pet, so I'm really worried that I'm gonna screw something up, and to hopefully avoid that I have a few questions! She's 6 months and 3 weeks old btw

  1. How often should I change the litterbox? I've been doing it immediately after my kitten uses it but I'm pretty sure that's not necessary.
  2. Is it okay for her to be licking her stitches from her desexing surgery thingy? I assumed it wouldn't be okay but the pet store I got her from didn't seem to care too much about it since she wasn't wearing a cone. Also, I'm not sure how old the stitches are but I have to take her to the vet in two weeks to get them removed if that matters
  3. She keeps trying to eat her litter (it's absorbing litter) and I keep trying to distract her and/or firmly telling her 'no' and getting her to drop it but I'm not sure how to get her to stop completely or even if this is normal behaviour. It's probably super dangerous though
  4. Mitsy keeps randomly making a weird croaking sound and it's probably just a hairball but she also ate a dust ball a few hours ago before I was able to stop her and she ate something crunchy (probably litter) that I couldn't see a little bit after that, but like. I'm still a bit scared.
  5. I'm keeping the litterbox and food and water in my bedroom along with Mitsy for now because my Nanna doesn't want her wandering around the house at night, but I'm planning to move it all to the bathroom when she gets accustomed to my house and we get a cat fence for my Nanna's bedroom (she dislikes cats, the absolute monster!), but I was thinking that keeping the litterbox in my room is a bad idea and I feel kind of guilty about having her locked in my room all night and her waking up before me and probably wanting to explore and play
  6. Any tips on getting kittens to feel more comfortable around stairs? I want to let her know that downstairs is okay too (she seems to be okay upstairs now) and I've been trying to coax her down one step at a time using treats but its not really working. I also don't want to pick her up because I don't want to accidentally touch her stitches and hurt her.
  7. Since Mitsy is already litter trained, it's okay for me to give her treats whenever, right? And on a similar note, is it normal for kittens to avoid treats when they're in a new home?
  8. She hasn't done anything 'bad' yet, but when she does, what's a good way of teaching her not to do that? I don't want to yell at her or spray her with water or anything because that's mean and I want to be a nice cat mum and she doesn't know any better
  9. Is getting a cat harness worth it? I'm never going to let her out into the backyard or outside because of how easy it is for her to run into a possibly not so friendly cat or escape the backyard which is literally bordering a road that has a lot of cars and she doesn't know the area so she can't find her way back home. I'm thinking that maybe I can take her on walks like a dog but my family said that's stupid so I'm not sure
  10. I've been playing with her a lot with a feather toy (sorta, it's actually a star on a string and it had a moon as well with bells but those came off) and she keeps jumping really high and twisting in midair which looks like it should be painful because of the stitches (can you tell I'm really worried about the stitches?) but maybe she's just really good at hiding pain?

I think that's it, please please please answer even one of these if you can and any assorted kitten raising tips are greatly appreciated!
submitted by Seline_Kirotashi to cats [link] [comments]


2024.06.01 14:13 RalseiTheGoat8 Smash or Pass competition (kinda) tournament thing - contestant 132! Hermit.

Second post of the day and our 132nd contestant is...
Gives you a permit
Here are some copy-paste ground rules based on my own judgment and YOUR voting. (v.1.0 may get updated)
  1. All obvious minors will be brutally murdered to not participate in this thing (mostly just Clover, Kanako (oh wait), Kanako's friend, and Karen (hospital koala) ).
  2. Don't get too weird in the comments. I for one do not care, but let's not make other members and admins uncomfortable, shall we?
  3. Participants will arrive in the (more or less) order we meet them in the game.
  4. There will be 2 posts per day and I will delete posts once I recorded the results to not flood the sub.
  5. Despite the name it doesn't have to be a sexual thing. Romance? Aesthetic? Vote based on whatever makes YOU comfortable and ignore the wording.
  6. Results will be based on a percentage of smashes compared to passes. For a more open explanation find my old post about it or just wait till the end of the tournament, you'll probably figure it out.
  7. I will be taking the battle sprites if available.
  8. Also ignore the unfunny commentary I leave next to the vote options; they're just for my own amusement and to add something unusual to a plain vote:
View Poll
submitted by RalseiTheGoat8 to UndertaleYellow [link] [comments]


2024.06.01 14:12 Slidebyte101 [STORE]: 🧧 --- Slidebyte's Ship Shop --- 🧧 (Main Store) Rare Ships, Unique Paints, Legacy Alpha Game Packs & Awards, Store Credit, Middleman Services, Account Liquidation Services, MSR Nightrunner, Free Hangar Fees Award, Subscriber Items & More 🛰

Greetings fellow Citizens o7! Long time backer and trader here.
I've been forced to condense the store to only the rarer items due to ANOTHER bug with Reddit's new UI that prevents me from editing the store pages to update them. If you're looking for a CCU or something specific, feel free to ask. If you're hesitant on a price, also feel free to ask; many items are being sold on someone else's behalf, so flexibility may vary.
Keep an eye out for "SALE" tags, where the seller has decided to sell at a loss or below market value, or where items are extremely rare / limited.
-------------------------------------------------- ORDER PROCESS --------------------------------------------------
You will need to provide your Paypal email for the invoice as well as BOTH your RSI email & RSI name that the item/s get sent to.
Please familiarize yourself with CiG's gifting rules & ToS on their website.
Please understand that some of these items are in buyback and prices are subject to change without my knowledge. If this happens, I'll let you know, and we can reevaluate the transaction.
Abbreviations:
OC = "Original Concept"
obo = "or best offer."
OST = "Official Soundtrack"
LTI = "Lifetime Insurance"
Please understand that this is "not" my job, but I will respond as quickly as possible, usually within a 24hr period. Please allow a minimum of 24hrs for a response. Thanks for understanding!
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ IMPORTANT ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If you're interested in anything or have any questions about items, transfers, CCU chaining or Star Citizen in general, please don't hesitate to shoot me a message. Happy to talk about anything SC related!
If you're new to Star Citizen and thinking of buying a game package, feel free to use my promo code to get extra goodies, promotional ships & /or credits added to your account: STAR-YC6L-5ZTY
If you're interested in building a fun community, in need of an Org to join & folk to play with, feel free to check out ours: Crypteian State Syndicate [CRYPTEIAN]
TABLE OF CONTENTS:
  1. Game Packages / Ship Packs
  2. Stand-alone ships
  3. Unique Paints
  4. Store Credit / Armors / Weapons / Other
  5. Accounts: Space Marshal with Unique MSR Night Runner, OG Legacy Backer Accounts w/Hangar Fee Rewards etc. (Ask for details)
If you don't see something you're looking for let me know. There's about 3-4 pages unlisted.
Game Packages / Ship Packs:
Title: Notable Contents: Insurance: Total After Fees:
Arbiter Legacy Alpha Game Pack (JPx2) 325A, SC, SQ42, Legacy Alpha, Star Map, OST etc. LTI $169.20 (SALE)
Best in Show 2951 Hercules C2 Herc C2 + (IAE Leather Jacket & Unique blue / black BIS Livery) 10y $479
Best in Show 2952 Mercury Star Runner MSR & name reservation + ('52 Coin & Unique red / black BIS Livery) 10y $319
Best in Show 2952 C8X Pisces Expedition (x5) Pisces Expedition + ('52 Coin & Unique red / black BIS Livery) 10y $60
Best in Show 2952 Scorpius Scorpius + ('52 Coin & Unique red / black BIS Livery) 10y $299
Best in Show 2953 Corsair Corsair + ('53 Poster & Unique purple iridescent BIS Livery) 10y $299
Best in Show 2953 Vulture Vulture + ('53 Poster & Unique purple iridescent BIS Livery) 10y $209
Best in Show 2953 600i Exploration 600i & name reservation + ('53 Poster & Unique purple iridescent BIS Livery) 10y $569
Constellation Andromeda + SQ42 Legacy Game Pack Revel & York Hangar, PTV, 10,000 UEC, Manual, SQ 42, SC, Soundtrack, Star Map, Making of SC, Constellation poster, Cot, Work Bench, Fishtank Mk 1, Vindel, Oshi, Thorshu, Grey Ribbon Fish (Vario Vittas) 6mo $359
Digital Freelancer Legacy Alpha Game Pack (JP) Freelancer, SC, SQ42, 5k uec, Digital Engineering Manual, OST, Star Map, Legacy Alpha LTI $257.60 (SALE)
Lightspeed Legacy Alpha Pack (JP) Unique Origin Racing Suit, F7C-M (CCU'd), SC, SQ42, Digital Star Map, OST, Legacy Alpha, Etc. LTI $389 (SALE)
Next Generation Aurora Game Pack (JP) Aurora Legionnaire, SC, SQ42 etc. LTI $99 (SALE)
Pioneer Pack Pioneer, Greycat Estates Geostack-X Planetary Beacons, UEE Land Claim License Estate Parcel, Outpost Construction Material 10y $1099
Spirit Collection C1, E1 & A1 Spirits 6mo $455
Weekend Warrior Pack (JP) Model II Arclight Sidearm, SC, SQ42, F7C-M, 5000uec, Star Map, OST LTI $306 (SALE)
100i Foundation Festival Starter Pack (Warbond) 100i + Unique Limited Foundation Festival Paint, SC Digital Download 6mo $75
2. Standalone Ships:
Title: Notable Contents: Insurance: Total After Fees:
Apollo Medivac ILW Edition - 10y $310
Aurora Legionnaire 2944 (Original Concept) (JPx2) - LTI $69 obo (SALE)
Ares Inferno ILW Edition - 10y $280
Ares Ion ILW Edition - 10y $280
Archimedes P72 (Original Concept) (El) Poster / Model LTI $49 (SALE)
Avenger Stalker (Original Concept) (JP) - LTI $79 (SALE)
Banu Defender - 6mo $220
Banu Defender (Original Concept) (El) Poster / Model LTI $239 (SALE)
Banu Merchantman - 6mo $660
Banu Merchantman Anniversary Edition (El) - 3y $440
Blade - 6mo $300
Carrack 2949 Edition Carrack name reservation 10y $550
Carrack 2952 IAE Edition Carrack name reservation 10y $660
Carrack (Original Concept) (El) Poster / Model, Anvil Manufacturer Shirt, Anvil Hat, Carrack Plushie, Name Reservation LTI $699
Caterpillar ILW Edition - 10y $363
Crucible ILW Edition - 10y $390
Corsair ILW / IAE Edition - 10y $275
Constellation Phoenix 2015 Anniversary Edition (El) - 3y $399 (SALE)
Eclipse Showdown Edition - 24mo $335
Eclipse (Fl) - LTI $330 (SALE)
Endeavor IAE Edition - 10y $390
Endeavor Hope Class (El) (Medical Bay & Hangar) 3y $509 (SALE)
Endeavor Biodome Pod IAE 2950 - - $115
Endeavor Collider Pod IAE 2950 - - $140
Expanse - 6mo $165
Freelancer (Original Concept) (JP) - LTI $140 (SALE)
F7C Hornet Heartseeker Edition - 6mo $234
F7C Hornet Wildfire IAE Edition - 10y $215
F7C MkII - 6mo $195
F7C-M 2943 Super Hornet (Original Concept) (El) - LTI $199 (SALE)
Fury ILW Edition - 10y $62
Fury MX ILW Edition - 10y $62
G12 IAE Edition - 10y $73
G12r IAE Edition - 10y $73
G12a ILW Edition - 10y $77
Galaxy (Original Concept) (Fl) Galaxy + Unique Concierge Protector Livery LTI $425 (SALE)
Galaxy Cargo Module - 6mo $80
Galaxy Refinery Module - 6mo $90
Galaxy Med Bay Module - 6mo $100
Gladius Valiant ILW Edition - 10y $125
Glaive IAE Edition - 10y $390
Genesis Starliner (Original Concept) (El) Poster / Model LTI $459
Hammerhead (Original Concept) (El) Poster / Model / Name Reservation LTI $729 (SALE)
Hurricane ILW Edition - 10y $235
Legionnaire ILW Edition - 10y $135
Liberator - 6mo $633
Lynx ILW Edition - 10y $70
Mercury Star Runner Fortuna Edition (MSR Name Reservation + Fortuna Livery) 6mo $300
Mercury Star Runner ILW Edition (MSR Name Reservation) 10y $300
Mule ILW Edition - 10y $50
Nautilus ILW Edition - 10y $825
Nox IAE Edition - 10y $45
Nova ILW Edition - 10y $120
Orion IAE Edition - 10y $720
Orion (Fl) Cutter Concierge Groundswell Paint ($430 melt) LTI $389 (SALE)
Orion (Original Concept) (El) Poster / Model LTI $649
Perseus ILW Edition - 10y $750
Prowler - 6mo $485
Prowler (Original Concept) (El) Poster / Model / CCC AVES Helmet LTI $479 (SALE)
Polaris IAE Edition - 10y $825
Polaris - LTI
Polaris (Original Concept) (El) Poster / Model LTI $849 (SALE)
Railen IAE Edition 10y $260
Ranger CV IAE Edition 10y $42
Ranger RC IAE Edition 10y $42
Ranger TR ILW Edition 10y $50
Ranger TR IAE Edition 10y $50
Sabre ILW Edition 10y $195
Sabre Comet ILW Edition 10y $205
San'tok.yāi IAE Edition 10y $260
Scorpius ILW Edition 10y $265
SRV ILW Edition 10y $170
Storm ILW Edition 10y $100
Talon Shrike 6mo $130
Terrapin ILW Edition 10y $250
Terrapin Showdown Edition 24mo $242
X1 Force IAE Edition 10y $60
X1 Force (Original Concept) (El) Poster / Model LTI $65 (SALE)
X1 Velocity IAE Edition - 10y $55
Vanguard Sentinel ILW Edition - 10y $305
Vanguard Warden ILW Edition - 10y $290
Vanguard Warden (Legacy Original Concept) Model / Poster LTI $299
Vanguard Battlefield Upgrade Kit Anniversary (El) (Sentinel) 3y $35(SALE)
Vulture ILW Edition - 10y $165
Vulcan (Original Concept) (El) - LTI $219(SALE)
325A (Original Concept) (JP) - LTI $100 (SALE)
325A ILW (customized wood / leather interior and loadout) - 10y $115
400i Citizencon 2951 Exclusive Preorder Meridian Edition - 6mo $289
400i Fortuna Edition - 6mo $290
600i Showdown Edition (Exploration Module + Name Reservation) 24mo $525
600i Touring Fortuna Edition (Fortuna Livery + 600i name reservation) 6mo $510
890j (Original Concept) (GY) Poster / Model / Revel & York Hangar / Name Reservation etc. LTI $1800
3. Unique Paints:
Title: Description: Total After Fees:
Ares Lovestruck pink - iridescent $20
Aurora Invictus Blue & Gold blue & gold $10
Aurora Dread Pirate (Unique Legacy) (El) black / skull & crossbones $30 (SALE)
Aurora Military Paint - UEE Distinguished Service Skin (Unique Legacy) (El) OD green / grey $30 (SALE)
Avenger Invictus Blue & Gold blue & gold $12
Avenger Solar Winds steel & Red $12
Buccaneer Ghoulish Green green - iridescent $10
C8 Pisces Code Blue (Limited Concierge Exclusive) blue, white - iridescent $10
C8 Pisces 2953 Auspicious Red (Rooster) Red & Gold $8
Caterpillar Ghoulish Green green - iridescent $15
Carrack 2953 Auspicious Red (Rooster) Red & Gold $25
Constellation 2952 Auspicious Red (Monkey) Red & Gold $15
Cutter Groundswell (Limited Concierge Exclusive) olive & orange $10
Cutter Nightfall (Limited Concierge Exclusive) dark steel & teal $10
Cutlass Ghoulish Green green - iridescent $10
Cutlass Black Skull & Crossbones black, skull & crossbones $15
Cyclone Invictus Blue & Gold blue & gold $10
Defender Harmony purple/blue/green/red- iridescent $15
Defender Platinum Platinum - iridescent $15
Dragonfly Ghoulish Green green - iridescent $10
Expanse Stardust (Limited Concierge Exclusive) blue & black $15
F7C MkI Corin (Limited) olive & red $15
F7C MkI Ironheart (Limited) silver & red $15
F7C MkI Killian Blue (Limited) blue $15
F7 MkII Ironscale (Limited Concierge Exclusive) Black & Rose Gold $15
Freelancer 2951 Auspicious Red (Ram) Red & Gold $15
Fortuna 2952 3 Paint Pack (MSR, 400i, 600i) dark green - iridescent $45
Fury Leatherback (Limited Concierge Exclusive) olive, red $10
Galaxy Protector (Limited Concierge Exclusive) steel blue & white $20
Ghoulish Green 4 Pack green - iridescent $32
Ghoulish Green 7 Pack green - iridescent $55
Gladius Invictus Blue & Gold blue & gold $15
Gladius Solar Winds charcoal & red $15
Hammerhead Fortuna dark green - iridescent $25
Hawk Invictus Blue & Gold blue & gold $15
Herald Ghoulish Green green - iridescent $12
Hercules Invictus Blue & Gold blue & gold $25
Hornet Invictus Blue & Gold blue & gold $15
Hoverquad Lovestruck pink - iridescent $10
Ironclad Dauntless (Limited Concierge Exclusive) silver & black $25
Legionnaire Shadow Strike (Limited Concierge Exclusive) black $15
Liberator Condor Paint (Limited Concierge Exclusive) white, grey $25
Lovestruck Pack (Ares, Nomad, Hoverquad) pink - iridescent $30
Lynx - Moonrise (Limited Concierge Exclusive) silver $10
Mercury Star Runner Fortuna dark green - iridescent $15
MPUV Firebrand (Limited Concierge Exclusive) burnt orange $10
Mule 3 Pack (Limited Concierge Exclusive) - $15
Mule Ghoulish Green green - iridescent $10
Nomad 2951 Auspicious Red (Ram) Red & Gold $15
Nomad Lovestruck pink - iridescent $15
Nox Harmony purple/blue/green/red- iridescent $10
Odyssey Windrider (Limited Concierge Exclusive) white & black $30
Prowler Harmony purple/blue/green/red- iridescent $25
Prowler Ocellus green/red- iridescent $25
Railen Hyaotan (Limited Concierge Exclusive) dark $30
Redeemer Fortuna dark green - iridescent $20
Reliant Invictus Blue & Gold blue & gold $10
Retaliator ILW 2950 Pack blue & gold $20
Sabre Raven Ashcloud (Limited Concierge Exclusive) black & Gold $15
Sabre 2952 Auspicious Red (Monkey) Red & Gold $15
Scorpius Tiburon (Limited Concierge Exclusive) flying tiger teeth $20
Solar Winds 3 Pack - $30
Spirit Allegiant (Fl) white, black, red stripe $5 (SALE)
Spirit 3 Pack (Limited Concierge Exclusive) - $35
Spirit Crimson (Limited Concierge Exclusive) red & white $20
Spirit Intrepid (Limited Concierge Exclusive) olive, white, orange $20
Spirit Olympia (Limited Concierge Exclusive) black, gold - textured $20
Storm - Summit (Limited Concierge Exclusive) grey, white $15
STV Blue Steel (Limited Concierge Exclusive) blue, black $10
Talon Harmony purple/blue/green/red- iridescent $15
Talon Ocellus green/red- iridescent $15
Ursa Respite (Limited Concierge Exclusive) Black $9
Vanguard Invictus Blue & Gold blue & gold $15
Vanguard Fortuna dark green - iridescent $15
Vanguard Solar Winds charcoal, red $15
Vulture Ghoulish Green green - iridescent $15
X1 Auspicious Red (Dragon) Red & Gold $8
X1 Auspicious Red (Dog) Red & Gold $8
You Got Our Backs Electro Skin Hull 2013 (JP) unknown $60
Zeus Mk. II Concierge Exclusive Solstice Black, Grey, Gold $20
100i Invictus Blue & Gold blue & gold $10
100i Auspicious Red (Dragon) Red & Gold $8
100i Auspicious Red (Dog) Red & Gold $8
400i Auspicious Red (Dragon) Red & Gold $15
400i Auspicious Red (Dog) Red & Gold $15
400i Fortuna dark green - iridescent $15
400i Meridian (Limited Edition CitizenCon) Dark Steel $30
400i Penumbra (Limited Concierge Exclusive) Black, Gold Trim $30
600i Auspicious Red (Dragon) Red & Gold $20
600i Auspicious Red (Dog) Red & Gold $20
600i Fortuna dark green - iridescent $20
2951 Auspicious Red Pack Ram (Freelancer, Nomad) Red & Gold $20
2953 Auspicious Red Pack Rooster (Carrack, Pisces) Red & Gold $25
2952 Auspicious Red Pack Monkey (Connie, Sabre) Red & Gold $25
2954 Auspicious Red 8 Pack - Dog & Dragon (X1, 100i, 400i, 600i) Red & Gold $65
4. Armors / Weapons / Other:
Title: Description: Total After Fees:
Advocacy Tools (JP) Faction 9 Baton, E&I M34 Restraint System $80 obo
Citizencon 2951 Digital Goodies Pack (JP) 2951 Trophy, Arden Balefire Armor Set, RRS Fallout Knife $40
Fieldsbury Dark Bear Helmets Choice of Pink / Brown / Orange / Green / Purple / Teal $6
Fieldsbury Dark Bear Sinister Pack (all six helmets) $30
Normal Subscriber Items Ask Ask
Plentiful Salvage Space Globe 2015 (JP) $10
Star Citizen Digital Novella 2013 $17
SQ42 Digital Manual 2013 - $20
Game Universe Map Digital Star Map $7
Subscriber items being sold at a loss (no markup for fees):
Title: Description: Total After Fees:
C2 Hercules Starlifter Plushie (SALE) $4
Mandible Snowfly Helmet (Fl) (SALE) $4
"Igniter" Lightning Co. Weapons Pack (Fl) Atzkaz sniper & Yubrev Pistol (SALE) $8
"Venom" Lightning Co. Weapons Pack (Fl) Atzkaz sniper & Yubrev Pistol (SALE) $8
Neoni "Tengubi" Helmet (Fl) (SALE) $4
"Venom" Lightning Co. Weapons Pack (El) Atzkaz sniper & Yubrev Pistol (SALE) $8
Avenger Copernicus Paint (El) (SALE) $5
100 Series Sand Wave Paint (El) (SALE) $5
Mandible Snowfly Helmet (El) (SALE) $4
Store Credit Sales:
These store credit sales are Middleman sales for clients, so prices are firm.
The price for each transaction is 60% of melt value + $20 per transaction to cover each giftable host ship. Transactions will be billed independently & limited to 1 per day to stay within CiG's $1000/day limit.
Available transactions are as follows:
Ship: Melt: Total After Fees:
Hammerhead (El) $725 $455
5. Accounts:
- 2014 Space Marshal Account with Unique MSR Night Runner Paint, OC Buyback Ships, Unique Limited Subscriber Items (Big Bennys Machine, 2946+ Trophies Etc.) & Legacy Backer Awards (Ask for more details.)
Details:
Currently liquidating store credit & buyback ships to lower the price.
Current price including everything prior to liquidation is $2230. After liquidating the excess on the account, price will be reduced to around $1400.
Notable Original Concept or Legacy buybacks: Polaris, Archimedes, Vulcan, F7C-M, X1 Force, Connie Phoenix 2015 anniversary edition, Endeavor Hope Class, Banu Defender, Hammerhead
-2013 Original Backer High Admiral Account with Original & Veterans Backer Reward, Free Hangar Fees Reward, RSI Class II Test Pilot Space Suit, OC buyback Ships & Legacy Alpha Packages, Unique Limited Subscriber & UEC Items from 2013-2014, F7A Mk II Upgrade & Legacy Backer Awards. Open to offers (Ask for more details.)
Details:
Currently liquidating giftables & rare game packages from the buyback.
Current price including everything is $3616. After liquidating excess from the account & buyback, price will be reduced to around $2300.
See ya round the Verse, Citizen... o7!
submitted by Slidebyte101 to Starcitizen_trades [link] [comments]

