

2012.02.24 00:31 TransVoice: Share, Constructively Criticize, and Have fun!

A place to share your transgender vocal training related recordings for constructive criticism by the community
[link]


2009.09.21 19:21 Art History, from Prehistoric to Contemporary

This is a community of art enthusiasts interested in a vast range of movements, styles, media, and methodologies. Please feel free to share your favorite articles, essays, and discussions on artists and artworks.
[link]


2008.03.25 00:30 Reddit Parenting - For those with kids of any age!

/Parenting is the place to discuss the ins and outs, as well as the ups and downs, of child-rearing. From the early stages of pregnancy to when your teenagers are finally ready to leave the nest (even if they don't want to), we're here to help you through this crazy thing called parenting. You can get advice on potty training, talk about breastfeeding, discuss how to get your baby to sleep, or ask if that one weird thing your kid does is normal.
[link]


2024.06.01 14:50 StronkWesker69 regarding Divine Judge Suspicion ability

regarding Divine Judge Suspicion ability
Some people might not get how this ability works despite how easy it sounds. Simply keep your eye under your health bar: a judge icon sits there on cooldown, and when it changes it will randomly show you one of three colors: yellow, blue, or cyan. If it's CYAN, you must hit your opponent with a shadow ability. You can save your shadow bar if you successfully hit your opponent three times before you execute an attack.
If the icon is YELLOW, you must hit your opponent with a critical hit, whether from a kick or a weapon attack. A breakthrough charge might not be available since the judge icon expires really fast, so make sure you have the synchronization perk ready. Block breaking with a breakthrough charge won't count.
If the icon is BLUE, you must hit your opponent with a normal physical attack: a kick, a weapon attack, a ranged attack, a throw, or a block break with a breakthrough charge ready. Remember, it should be normal, not critical, so keep that in mind.
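Purely as an illustration of the rule set above (the names are mine, not in-game identifiers), the color-to-required-hit rules boil down to a lookup:

```python
# Hypothetical encoding of the Divine Judge Suspicion rules described above.
# The color and hit-type strings are illustrative labels, not game data.
REQUIRED_HITS = {
    "cyan": {"shadow ability"},
    "yellow": {"critical kick", "critical weapon attack"},
    "blue": {"kick", "weapon attack", "ranged attack", "throw",
             "breakthrough block break"},
}

def hit_counts(icon_color, hit_type):
    """True if a hit of this type advances the Suspicion counter for this icon color."""
    return hit_type in REQUIRED_HITS.get(icon_color, set())

print(hit_counts("cyan", "shadow ability"))  # True
print(hit_counts("yellow", "kick"))          # False: yellow needs a critical
```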
submitted by StronkWesker69 to Shadowfight3 [link] [comments]


2024.06.01 14:47 BoomerSweetness Why I think the golden bloon sucks, and what I would do to make it better (Long post)

While I think monkey teams are a fine addition to the game, the golden bloon fucking sucks.

The problems
First thing, if you back out using the home button, the golden bloon just disappears entirely. I think it's to stop you from backing out on rounds where the golden bloon spawns so that you can't prepare for it, but who the hell cares? And if I'm playing on Impoppable and bloons start leaking, I have to use a continue, because if I back out I'll lose the golden bloon. This has been a problem for more than 3 years, the golden bloon's entire existence, so why has there been no fix yet?
Second thing, it's so repetitive. Unlike monkey teams, where you can try different strategies, there are only a few towers that work on the golden bloon, and most of them require a very specific setup, which makes your choice of towers even more niche (you can only use short- and medium-range towers, and no towers near your glue/ice). I used to go with a 250 glue storm + bloon sabotage for the golden bloon, because the ability's global range let me pop the golden bloon without compromising my defenses, but even that doesn't work anymore due to the slowdown nerf of glue storm. The only other option that consistently pops the golden bloon without forcing you onto specific towers is Pop and Awe, which is extremely annoying to save up for in the early rounds, and if you time the ability even slightly late, the golden bloon will get past your defense. So in general, the game wants you to buy the exact same towers every run to pop the golden bloon and avoid global-range/long-range towers (which, let's be real, are usually the most fun towers to use).
Third thing, the golden bloon mechanic is just super annoying. Why does it have to spawn on a random round? I don't think there'd be any issue if it spawned on a specific round, like rounds that end in 5. After all, NK is trying to crack down on the RNG of many towers (which they should, because it's a tower defense game, and losing to RNG is not fun at all), yet they still leave the golden bloon's RNG in place; there's a huge difference in difficulty between a golden bloon spawning on round 21 versus round 29, for example. Same thing with the jumping mechanic: it's just so unpredictable that it makes the golden bloon unfun to play, especially on harder maps like advanced or expert (I usually don't like playing on beginner or intermediate since it's too easy, but on harder maps it becomes way too inconsistent).
And lastly, the rewards are just plain garbage. The only reason anyone would do this and put up with the frustration is to clear achievements. Seriously, I think it's much more efficient to just ignore the golden bloon and play like normal to clear the map faster. Also, the golden bloon isn't on all maps, so it's even rarer for it to conveniently be on a map that you're grinding.
TLDR: The golden bloon doesn't let you back out of the game, is super repetitive with very few strategies that work (and all of the good strategies are boring), has an RNG-heavy and frustrating mechanic overall, and the rewards definitely need a buff.
What I would do to fix it:
- Fix the back-to-home bug/feature; it's just an annoyance and absolutely does not need to be there.
- Either nerf the jump mechanic or rework it entirely; it just makes the golden bloon super annoying to deal with.
- Add some variety to golden bloon gameplay. One idea: every game, a special modifier gets randomly applied to the golden bloon. For example, a "golden MOAB" modifier: after round 80, the golden bloon transforms into a golden MOAB, which means you have to change how you tackle it, since low-tier ice monkeys or glue gunners won't work on it anymore. Another modifier could be a regrow golden bloon that regenerates HP over time but has reduced invincibility and jump range, making towers like the glue gunner less viable while making fast-attacking towers like the super monkey viable.
- Buff the reward. Maybe give an extra insta-monkey, and the more golden bloons you pop, the higher its rarity (maybe only up to tier 4, with a 5% chance to get a tier 5 if you pop all golden bloons).
- Reduce the RNG by making the spawn range smaller, or remove it entirely, and make the jump distance a fixed amount.
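The suggested reward buff (insta-monkey rarity scaling with golden bloons popped, capped at tier 4, plus a 5% tier-5 chance on a full clear) could work roughly like this. The numbers come from the suggestion above; the function itself is just an illustrative sketch, not game code:

```python
import random

def insta_reward_tier(popped, total, rng=random.random):
    """Tier of the insta-monkey reward, scaling with golden bloons popped."""
    # Map the pop count onto tiers 1-4.
    tier = min(4, 1 + (3 * popped) // max(total, 1))
    # Full clear: 5% chance to upgrade the reward to tier 5.
    if popped == total and rng() < 0.05:
        tier = 5
    return tier

print(insta_reward_tier(0, 10))                      # 1
print(insta_reward_tier(10, 10, rng=lambda: 0.01))   # 5
```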
submitted by BoomerSweetness to btd6 [link] [comments]


2024.06.01 14:40 Snarvid Jawhara Hate Thread

Jawhara Hate Thread
Like the other one, just for my hated enemy. Like, is there a bug leading to this statline or something? In any case, another breezy run followed by a Jawhara kill on the final day of Cycle 30.
Seems normal to me... is it the 6 figure health that's stressing you out, or the 975 speed on an AoE range 20 attacker?
Notice the Stasis on all 3 of them, as I had teleported them away with Astrostoicism already (as well as another one, which I killed with a Mura Mirror + Aurora Chant).
submitted by Snarvid to PathOfAchra [link] [comments]


2024.06.01 14:26 LoveScoutCEO Foolproof Real-Life Dating Plan for Guys Who Don't Want to Pay for Dating Apps, Singles Tours, or Matchmakers. This works! If you are in your twenties and looking to get married, have kids, join the PTA, and coach Little League - this will work for almost anyone who tries.

So, I get MANY DMs from guys who want some coaching. That is fine. I enjoy it and I never can quite work myself up to charge for it.
I try to get them to tell me the basics: how old they are, where they are from, what they do for a living, roughly what their long-term earning potential appears to be, and what their relationship goals are. I probably should flesh this out a little more, because I really need to know how ambitious they are and how they handle conflict and disagreement.
There Is No One Size Fits All Approach
My basic advice varies by who the man is, what his life experience has been, and what his relationship goals actually are. It really depends on a lot of variables.
For very confident guys with a lot of international travel experience - particularly guys over 35 - I tend to encourage them to just go back. If they want to sign up for an app, so be it. But those guys can usually do well overseas without much help at all. A good many of these guys are small business owners or military vets.
Often I am not sure why these guys are even asking me for advice. They don't need any help. Many have been to Ukraine or the Philippines before and they had a great time then. They are going to succeed.
For guys over 35, particularly if they have a track record of difficulty with women or in many cases rarely if ever date, I usually encourage them to consider AFA. A lot of these guys are IT professionals, lawyers, doctors, CPAs, and sometimes even trust fund heirs.
Many are in the 1% for intelligence but have some issues with social skills. One was worth maybe $20 billion. I didn't figure that out right away, because he did not tell anyone he was ridiculously wealthy.
Anyhow, these guys are the ones I have absolutely no problems urging to take an AFA tour. Meet some interesting women in real life. Many of these guys never meet a single woman in real life. A lot of them work mostly alone or in all male settings. A few - particularly those working in Silicon Valley - don't speak to women at their workplace, which was the traditional place that men like this met their mates starting in the 1970s, because HR has terrified them with sexual harassment "training."
For them an AFA tour often seems like a miracle, and I get a lot of positive feedback from them, but this leaves out at least half of the guys who reach out to me.
The Other Guys
The other guys are more challenging to help. Sometimes going overseas is a real challenge. Sometimes guys are young and I rarely encourage guys under thirty to start international dating. Many do and I know it can and does work out sometimes, but sometimes it is more of a challenge.
Sometimes guys simply are not financially able to do international dating. It does take a certain amount of upfront capital - at least a few thousand dollars for a plane ticket and a week in a hotel somewhere.
Occasionally, I get the vibes that a guy is not emotionally ready for international dating or in fact ANY dating. I have had guys reach out to me a week after their wife moved out and left them nothing but a letter from her attorney saying she was filing for divorce. They are not in the right place in life.
A very few guys seem unbalanced and I try to encourage them to get some professional help before they start dating. At the very least read some self-help books about handling stress, relationships, and so on. This is pretty rare, but there are guys who are not ready for a relationship with a dog much less a woman.
What Do I Recommend?
For normal guys under 35, particularly normal guys following a regular middle-class career path and living in an American city of at least a couple hundred thousand people or larger, here is my approach.
First, consider taking a college class. There are always more women than men in college classes these days, especially if you opt for a class that historically has higher female participation. Art history, psychology, and sociology are all great ideas. You will meet women - generally younger women - some of whom will be single.
Second, join a cross fit gym or something similar. Headphones have basically destroyed regular gyms as a social place. No one talks to anyone. That sucks and is a big part of the rise of cross fit because they are very social. Plus, you will get in shape and that is a positive.
Third, and this is probably the best advice. Start going to church. Look, in the United States in most mid-sized cities there are a huge range of churches. If you are conservative this should be incredibly easy, because the US is awash in conservative churches, but there are other choices.
Most large cities have some very accepting denominations like Unitarians and certain varieties of Lutheran, Episcopal, Presbyterian, and others. Heck, the Methodist church just blew itself apart over how conservative it should be. Just start going to churches until you find one that you find OK or better. If it helps you focus on your goals and improve how you treat others, it is probably going to help you grow as a person.
Church Singles Groups
They will usually help you improve your dating life, because almost every church will help you meet single women. In the smallest church it might not be formalized, but most churches that are moderately sized have singles groups which normally have FAR more single women than single men. But don't trust me about it. Here is a whole article that explains the gender imbalance in churches.
I am focusing on Christian groups because that is what I know best, but this advice probably holds true in other religions too. I know it was true in the large synagogues in Los Angeles at one time, because I had one Jewish buddy who raved about it and I believe met his wife through a synagogue group.
So, your chances of success are very high. Essentially, the people who lead these groups are matchmakers and like all good matchmakers most of them are very invested in helping their members succeed.
And it works for a ton of people. Here is a longer article by some relationship coach about how to find a church singles group that works for you. It has some interesting information, but there are also large Christian dating groups not tied to one church - especially if you live in a large metro area. This article is tied to someone trying to earn a buck, but I have nothing against that, and it looks like a pretty good resource too.
Change Is Hard
So, I know there are some guys who have a million reasons why this won't work for them before they have even read down to here. Maybe it won't. Who knows? But if you want to change your life you have to take risks.
Change is hard and uncomfortable. But each of these suggestions does work regularly for certain singles. Why not you?

submitted by LoveScoutCEO to MailOrderBrideFacts [link] [comments]


2024.06.01 14:25 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor's thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue where my model only predicts 2 classes out of the 475 possible classes. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs in 82 nodes.
That's my training script:
    import os
    import tensorflow as tf
    from tensorflow.keras.preprocessing.image import ImageDataGenerator
    from tensorflow.keras.applications import EfficientNetB7
    from tensorflow.keras import layers, models
    from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
    import tensorflow_addons as tfa
    import logging
    import json

    # Setup logging
    logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

    # Check if GPUs are available
    gpus = tf.config.experimental.list_physical_devices('GPU')
    if gpus:
        try:
            for gpu in gpus:
                tf.config.experimental.set_memory_growth(gpu, True)
            tf.config.set_visible_devices(gpus, 'GPU')
            logging.info(f"Using {len(gpus)} GPUs.")
        except RuntimeError as e:
            logging.error(e)
    else:
        logging.error("No GPUs found. Check your device configuration.")

    # Data directory
    data_dir = "/app/FOOD475/"

    # Image dimensions and batch size
    img_height, img_width = 600, 600
    batch_size = 64

    # Data preprocessing and augmentation
    train_datagen = ImageDataGenerator(
        rescale=1./255,
        rotation_range=40,
        width_shift_range=0.2,
        height_shift_range=0.2,
        shear_range=0.2,
        zoom_range=0.2,
        horizontal_flip=True,
        fill_mode='nearest',
        validation_split=0.25
    )

    # Load and preprocess images
    train_generator = train_datagen.flow_from_directory(
        data_dir,
        target_size=(img_height, img_width),
        batch_size=batch_size,
        class_mode='categorical',
        subset='training'
    )

    validation_generator = train_datagen.flow_from_directory(
        data_dir,
        target_size=(img_height, img_width),
        batch_size=batch_size,
        class_mode='categorical',
        subset='validation'
    )

    # Model creation function
    def create_model(input_shape, num_classes):
        base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
        base_model.trainable = True
        inputs = layers.Input(shape=input_shape)
        x = base_model(inputs, training=True)
        x = layers.GlobalAveragePooling2D()(x)
        outputs = layers.Dense(num_classes, activation='softmax')(x)
        model = models.Model(inputs, outputs)
        return model

    def find_latest_saved_model(checkpoint_dir):
        logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
        if not os.path.exists(checkpoint_dir):
            logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
            return None, 0
        subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
                   if os.path.isdir(os.path.join(checkpoint_dir, d))]
        if not subdirs:
            logging.info("No subdirectories found for checkpoints.")
            return None, 0
        latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
        latest_epoch = int(os.path.basename(latest_subdir))
        logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
        if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
            return latest_subdir, latest_epoch
        else:
            logging.info("No saved_model.pb found in the latest directory.")
            return None, 0

    # Mirrored strategy for multi-GPU training
    strategy = tf.distribute.MirroredStrategy()
    with strategy.scope():
        saved_model_dir = 'model_training'
        checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
        latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
        if latest_saved_model:
            logging.info(f"Loading model from {latest_saved_model}")
            model = tf.keras.models.load_model(latest_saved_model)
        else:
            logging.info("No saved model found. Creating a new model.")
            model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

        if not os.path.exists(saved_model_dir):
            os.makedirs(saved_model_dir)

        summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
        with open(summary_path, 'w') as f:
            model.summary(print_fn=lambda x: f.write(x + '\n'))
        logging.info(f"Model summary saved to {summary_path}")

        optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
        model.compile(
            optimizer=optimizer,
            loss='categorical_crossentropy',
            metrics=['accuracy',
                     tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                     tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')]
        )

    # Custom Callback for Saving the Best Model in SavedModel format
    class SaveBestModelTF(tf.keras.callbacks.Callback):
        def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
            super(SaveBestModelTF, self).__init__()
            self.monitor = monitor
            self.saved_model_dir = saved_model_dir

        def on_epoch_end(self, epoch, logs=None):
            current = logs.get(self.monitor)
            if current is None:
                logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
                return
            logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
            epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
            if not os.path.exists(epoch_path):
                os.makedirs(epoch_path)
            self.model.save(epoch_path, save_format='tf')

    # Callbacks for monitoring progress
    tensorboard_cb = TensorBoard(log_dir='./logs')

    # Save class indices to a JSON file
    class_indices_path = 'model_training/class_indices.json'
    if not os.path.exists(os.path.dirname(class_indices_path)):
        os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
    with open(class_indices_path, 'w') as file:
        json.dump(train_generator.class_indices, file)
    logging.info(f"Class indices saved to {class_indices_path}")

    # Model training
    total_epochs = 7
    model.fit(
        train_generator,
        initial_epoch=latest_epoch,  # Start from the next epoch
        epochs=total_epochs,
        validation_data=validation_generator,
        callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
    )

    # Evaluate the model
    eval_result = model.evaluate(validation_generator)
    logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

    # Save the final model as a SavedModel format (including .pb files)
    model.save('model_training/finished_model')
    logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

    # Convert to TensorFlow Lite
    converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
    tflite_model = converter.convert()
    tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
    if not os.path.exists(os.path.dirname(tflite_path)):
        os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
    with open(tflite_path, 'wb') as f:
        f.write(tflite_model)
    logging.info(f"Model converted and saved as {tflite_path}")
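The resume logic in the script hinges on checkpoint subdirectories being named with bare epoch numbers and compared numerically (a lexical sort would put "10" before "2"). A minimal, TensorFlow-free sketch of that directory-picking step; the temp-dir layout is my own illustration, not the OP's actual `model_training/checkpoints` tree:

```python
import os
import tempfile

def latest_epoch_dir(checkpoint_dir):
    """Pick the checkpoint subdirectory with the highest numeric name."""
    subdirs = [d for d in os.listdir(checkpoint_dir)
               if os.path.isdir(os.path.join(checkpoint_dir, d)) and d.isdigit()]
    if not subdirs:
        return None, 0
    latest = max(subdirs, key=int)  # numeric, not lexical, comparison
    return os.path.join(checkpoint_dir, latest), int(latest)

with tempfile.TemporaryDirectory() as root:
    for name in ("1", "2", "10"):
        os.makedirs(os.path.join(root, name))
    path, epoch = latest_epoch_dir(root)
    print(epoch)  # 10
```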
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
And when I try to load the model and make a prediction with this code:
    class own:
        def __init__(self):
            if not os.path.exists("models/own"):
                raise FileNotFoundError(f"Model path models/own does not exist")
            try:
                self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
            except Exception as e:
                print(f"Error loading model: {e}")
                raise
            if not os.path.exists("models/own/class_indices.json"):
                raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
            with open("models/own/class_indices.json", 'r') as file:
                self.class_indices = json.load(file)
            self.index_to_class = {v: k for k, v in self.class_indices.items()}

        def classify(self, img_path):
            if not os.path.exists(img_path):
                raise FileNotFoundError(f"Image path {img_path} does not exist")
            # Load and preprocess the image
            img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
            img_array = tf.keras.preprocessing.image.img_to_array(img)
            img_array = np.expand_dims(img_array, axis=0)
            img_array /= 255.0
            # Make prediction
            predictions = self.model.predict(img_array)
            print("Raw predictions:", predictions)
            top_index = np.argmax(predictions[0])
            top_class = self.index_to_class[top_index]
            print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
            top_n = 5
            top_indices = np.argsort(predictions[0])[-top_n:][::-1]
            for idx in top_indices:
                print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
            return top_class
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see:
warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme:
warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
1/1 [==============================] - 4s 4s/step
Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04
    ...]] (475-element probability vector; values truncated)
    Top class: omelette, Probability: 0.03991156816482544
    Class: omelette, Probability: 0.03991156816482544
    Class: steak, Probability: 0.03165709227323532
    Class: tacos, Probability: 0.027573999017477036
    Class: breakfast_burrito, Probability: 0.021740607917308807
    Class: pulled_pork_sandwich, Probability: 0.01914990320801735
    (own): omelette - 3.66s

https://github.com/tensorflow/addons/issues/2807
https://github.com/tensorflow/addons
Help would be appreciated because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to computervision [link] [comments]


2024.06.01 14:24 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor's thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue: the model only predicts 2 of the 475 possible classes. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes.
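To make "only predicts 2 classes" concrete: over a whole batch of test images, only two distinct class indices ever win the argmax. A minimal, TensorFlow-free sketch of that check (the probability rows below are toy data, not my real model output):

```python
def distinct_predicted_classes(rows):
    """Return the set of argmax indices over a list of probability rows."""
    winners = set()
    for row in rows:
        winners.add(max(range(len(row)), key=row.__getitem__))
    return winners

# Toy example: three images, four classes -- only classes 1 and 3 ever win.
rows = [
    [0.1, 0.6, 0.1, 0.2],
    [0.2, 0.1, 0.2, 0.5],
    [0.1, 0.7, 0.1, 0.1],
]
print(distinct_predicted_classes(rows))  # {1, 3}
```

Running this over my real predictions always yields a two-element set, no matter how many images I feed in.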
That's my training script:
    import os
    import tensorflow as tf
    from tensorflow.keras.preprocessing.image import ImageDataGenerator
    from tensorflow.keras.applications import EfficientNetB7
    from tensorflow.keras import layers, models
    from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
    import tensorflow_addons as tfa
    import logging
    import json

    # Setup logging
    logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

    # Check if GPUs are available
    gpus = tf.config.experimental.list_physical_devices('GPU')
    if gpus:
        try:
            for gpu in gpus:
                tf.config.experimental.set_memory_growth(gpu, True)
            tf.config.set_visible_devices(gpus, 'GPU')
            logging.info(f"Using {len(gpus)} GPUs.")
        except RuntimeError as e:
            logging.error(e)
    else:
        logging.error("No GPUs found. Check your device configuration.")

    # Data directory
    data_dir = "/app/FOOD475/"

    # Image dimensions and batch size
    img_height, img_width = 600, 600
    batch_size = 64

    # Data preprocessing and augmentation
    train_datagen = ImageDataGenerator(
        rescale=1./255,
        rotation_range=40,
        width_shift_range=0.2,
        height_shift_range=0.2,
        shear_range=0.2,
        zoom_range=0.2,
        horizontal_flip=True,
        fill_mode='nearest',
        validation_split=0.25
    )

    # Load and preprocess images
    train_generator = train_datagen.flow_from_directory(
        data_dir,
        target_size=(img_height, img_width),
        batch_size=batch_size,
        class_mode='categorical',
        subset='training'
    )
    validation_generator = train_datagen.flow_from_directory(
        data_dir,
        target_size=(img_height, img_width),
        batch_size=batch_size,
        class_mode='categorical',
        subset='validation'
    )

    # Model creation function
    def create_model(input_shape, num_classes):
        base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
        base_model.trainable = True
        inputs = layers.Input(shape=input_shape)
        x = base_model(inputs, training=True)
        x = layers.GlobalAveragePooling2D()(x)
        outputs = layers.Dense(num_classes, activation='softmax')(x)
        model = models.Model(inputs, outputs)
        return model

    def find_latest_saved_model(checkpoint_dir):
        logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
        if not os.path.exists(checkpoint_dir):
            logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
            return None, 0
        subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
                   if os.path.isdir(os.path.join(checkpoint_dir, d))]
        if not subdirs:
            logging.info("No subdirectories found for checkpoints.")
            return None, 0
        latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
        latest_epoch = int(os.path.basename(latest_subdir))
        logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
        if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
            return latest_subdir, latest_epoch
        else:
            logging.info("No saved_model.pb found in the latest directory.")
            return None, 0

    # Mirrored strategy for multi-GPU training
    strategy = tf.distribute.MirroredStrategy()
    with strategy.scope():
        saved_model_dir = 'model_training'
        checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
        latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
        if latest_saved_model:
            logging.info(f"Loading model from {latest_saved_model}")
            model = tf.keras.models.load_model(latest_saved_model)
        else:
            logging.info("No saved model found. Creating a new model.")
            model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

        if not os.path.exists(saved_model_dir):
            os.makedirs(saved_model_dir)
        summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
        with open(summary_path, 'w') as f:
            model.summary(print_fn=lambda x: f.write(x + '\n'))
        logging.info(f"Model summary saved to {summary_path}")

        optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
        model.compile(
            optimizer=optimizer,
            loss='categorical_crossentropy',
            metrics=['accuracy',
                     tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                     tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')]
        )

    # Custom callback for saving the best model in SavedModel format
    class SaveBestModelTF(tf.keras.callbacks.Callback):
        def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
            super(SaveBestModelTF, self).__init__()
            self.monitor = monitor
            self.saved_model_dir = saved_model_dir

        def on_epoch_end(self, epoch, logs=None):
            current = logs.get(self.monitor)
            if current is None:
                logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
                return
            logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
            epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
            if not os.path.exists(epoch_path):
                os.makedirs(epoch_path)
            self.model.save(epoch_path, save_format='tf')

    # Callbacks for monitoring progress
    tensorboard_cb = TensorBoard(log_dir='./logs')

    # Save class indices to a JSON file
    class_indices_path = 'model_training/class_indices.json'
    if not os.path.exists(os.path.dirname(class_indices_path)):
        os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
    with open(class_indices_path, 'w') as file:
        json.dump(train_generator.class_indices, file)
    logging.info(f"Class indices saved to {class_indices_path}")

    # Model training
    total_epochs = 7
    model.fit(
        train_generator,
        initial_epoch=latest_epoch,  # Start from the next epoch
        epochs=total_epochs,
        validation_data=validation_generator,
        callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
    )

    # Evaluate the model
    eval_result = model.evaluate(validation_generator)
    logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

    # Save the final model in SavedModel format (including .pb files)
    model.save('model_training/finished_model')
    logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

    # Convert to TensorFlow Lite
    converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
    tflite_model = converter.convert()
    tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
    if not os.path.exists(os.path.dirname(tflite_path)):
        os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
    with open(tflite_path, 'wb') as f:
        f.write(tflite_model)
    logging.info(f"Model converted and saved as {tflite_path}")
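The resume logic in the script keys on numeric checkpoint subdirectory names. A stripped-down, TensorFlow-free sketch of that lookup (slightly hardened compared to the version above: it also skips over epoch directories that are missing saved_model.pb instead of giving up on the newest one):

```python
import os

def find_latest_saved_model(checkpoint_dir):
    """Return (subdir, epoch) for the highest-numbered checkpoint that
    actually contains a saved_model.pb, or (None, 0) if there is none."""
    if not os.path.isdir(checkpoint_dir):
        return None, 0
    # Only consider purely numeric subdirectory names (epoch numbers).
    epochs = [d for d in os.listdir(checkpoint_dir)
              if d.isdigit() and os.path.isdir(os.path.join(checkpoint_dir, d))]
    # Walk from the newest epoch downwards until a complete checkpoint is found.
    for epoch in sorted(epochs, key=int, reverse=True):
        subdir = os.path.join(checkpoint_dir, epoch)
        if os.path.exists(os.path.join(subdir, 'saved_model.pb')):
            return subdir, int(epoch)
    return None, 0
```

Note that sorting by `int` rather than lexicographically matters once training passes epoch 9 (directory "10" must beat "9").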
During training I got the following output:
    Found 182235 images belonging to 475 classes.
    Found 60544 images belonging to 475 classes.
    Epoch 1/7
    2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
    Epoch 2/7
    2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
    Epoch 3/7
    2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
    Epoch 4/7
    2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
    Epoch 5/7
    2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
    Epoch 6/7
    2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
    Epoch 7/7
    2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
    946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
And when I try to load the model and make a prediction with this code:
    import os
    import json
    import numpy as np
    import tensorflow as tf
    # F1Score is the tensorflow_addons metric used at training time; it must be
    # passed as a custom object so the saved model can be deserialized.
    from tensorflow_addons.metrics import F1Score

    class own:
        def __init__(self):
            if not os.path.exists("models/own"):
                raise FileNotFoundError("Model path models/own does not exist")
            try:
                self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
            except Exception as e:
                print(f"Error loading model: {e}")
                raise
            if not os.path.exists("models/own/class_indices.json"):
                raise FileNotFoundError("Class indices path models/own/class_indices.json does not exist")
            with open("models/own/class_indices.json", 'r') as file:
                self.class_indices = json.load(file)
            self.index_to_class = {v: k for k, v in self.class_indices.items()}

        def classify(self, img_path):
            if not os.path.exists(img_path):
                raise FileNotFoundError(f"Image path {img_path} does not exist")
            # Load and preprocess the image
            img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
            img_array = tf.keras.preprocessing.image.img_to_array(img)
            img_array = np.expand_dims(img_array, axis=0)
            img_array /= 255.0
            # Make prediction
            predictions = self.model.predict(img_array)
            print("Raw predictions:", predictions)
            top_index = np.argmax(predictions[0])
            top_class = self.index_to_class[top_index]
            print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
            top_n = 5
            top_indices = np.argsort(predictions[0])[-top_n:][::-1]
            for idx in top_indices:
                print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
            return top_class
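The top-n loop in classify() boils down to the following, shown here in pure Python with a made-up five-class probability row (the real model has 475 classes), in case the numpy argsort slicing is hard to read:

```python
def top_k(probs, index_to_class, k=5):
    """Return the k (class_name, probability) pairs with the highest probability."""
    # Indices sorted by descending probability; equivalent to
    # np.argsort(probs)[-k:][::-1] in the class above.
    order = sorted(range(len(probs)), key=probs.__getitem__, reverse=True)
    return [(index_to_class[i], probs[i]) for i in order[:k]]

# Toy data for illustration only -- class names and probabilities are invented.
probs = [0.05, 0.40, 0.10, 0.30, 0.15]
index_to_class = {0: 'pizza', 1: 'omelette', 2: 'tacos', 3: 'steak', 4: 'sushi'}
print(top_k(probs, index_to_class, k=2))  # [('omelette', 0.4), ('steak', 0.3)]
```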
it always either predicts Steak or Omelette:
    2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
    WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
    C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see:
      warnings.warn(
    C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme:
      warnings.warn(
    WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
    2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
    WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
    WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
    1/1 [==============================] - 4s 4s/step
    Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 ... 2.95209320e-05]] (475-element probability vector; values truncated)
    Top class: omelette, Probability: 0.03991156816482544
    Class: omelette, Probability: 0.03991156816482544
    Class: steak, Probability: 0.03165709227323532
    Class: tacos, Probability: 0.027573999017477036
    Class: breakfast_burrito, Probability: 0.021740607917308807
    Class: pulled_pork_sandwich, Probability: 0.01914990320801735
    (own): omelette - 3.66s

https://github.com/tensorflow/addons/issues/2807
https://github.com/tensorflow/addons
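For scale: the top probability above (~0.0399) is tiny in absolute terms for a 475-class softmax. A quick back-of-the-envelope check (plain Python, numbers taken from the log) shows it is only about 19x the uniform baseline, i.e. the output distribution is nearly flat rather than confidently wrong:

```python
# Compare the logged top probability against a perfectly uniform softmax.
num_classes = 475
uniform = 1.0 / num_classes      # probability per class if the model were guessing
top_p = 0.0399                   # top-class probability from the log above

print(round(uniform, 4))         # 0.0021
print(round(top_p / uniform))    # 19
```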
Help would be appreciated because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to learnmachinelearning [link] [comments]


2024.06.01 14:23 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead. 1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 
9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 
5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 
3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 
1.54079917e-05 9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]] Top class: omelette, Probability: 0.03991156816482544 Class: omelette, Probability: 0.03991156816482544 Class: steak, Probability: 0.03165709227323532 Class: tacos, Probability: 0.027573999017477036 Class: breakfast_burrito, Probability: 0.021740607917308807 Class: pulled_pork_sandwich, Probability: 0.01914990320801735 (own): omelette - 3.66shttps://github.com/tensorflow/addons/issues/2807https://github.com/tensorflow/addons 
Help would be appreciated because im slowly losing my mind :(,
Jonas
submitted by Jonasbru3m to deeplearning [link] [comments]


2024.06.01 14:21 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor Thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue: the model only predicts 2 of the 475 possible classes. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes.
This is my training script:
import os
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications import EfficientNetB7
from tensorflow.keras import layers, models
from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
import tensorflow_addons as tfa
import logging
import json

# Setup logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# Check if GPUs are available
gpus = tf.config.experimental.list_physical_devices('GPU')
if gpus:
    try:
        for gpu in gpus:
            tf.config.experimental.set_memory_growth(gpu, True)
        tf.config.set_visible_devices(gpus, 'GPU')
        logging.info(f"Using {len(gpus)} GPUs.")
    except RuntimeError as e:
        logging.error(e)
else:
    logging.error("No GPUs found. Check your device configuration.")

# Data directory
data_dir = "/app/FOOD475/"

# Image dimensions and batch size
img_height, img_width = 600, 600
batch_size = 64

# Data preprocessing and augmentation
train_datagen = ImageDataGenerator(
    rescale=1./255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest',
    validation_split=0.25
)

# Load and preprocess images
train_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='training'
)

validation_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='validation'
)

# Model creation function
def create_model(input_shape, num_classes):
    base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
    base_model.trainable = True
    inputs = layers.Input(shape=input_shape)
    x = base_model(inputs, training=True)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(num_classes, activation='softmax')(x)
    model = models.Model(inputs, outputs)
    return model

def find_latest_saved_model(checkpoint_dir):
    logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
    if not os.path.exists(checkpoint_dir):
        logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
        return None, 0
    subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
               if os.path.isdir(os.path.join(checkpoint_dir, d))]
    if not subdirs:
        logging.info("No subdirectories found for checkpoints.")
        return None, 0
    latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
    latest_epoch = int(os.path.basename(latest_subdir))
    logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
    if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
        return latest_subdir, latest_epoch
    else:
        logging.info("No saved_model.pb found in the latest directory.")
        return None, 0

# Mirrored strategy for multi-GPU training
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    saved_model_dir = 'model_training'
    checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
    latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
    if latest_saved_model:
        logging.info(f"Loading model from {latest_saved_model}")
        model = tf.keras.models.load_model(latest_saved_model)
    else:
        logging.info("No saved model found. Creating a new model.")
        model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

    if not os.path.exists(saved_model_dir):
        os.makedirs(saved_model_dir)

    summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
    with open(summary_path, 'w') as f:
        model.summary(print_fn=lambda x: f.write(x + '\n'))
    logging.info(f"Model summary saved to {summary_path}")

    optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
    model.compile(optimizer=optimizer,
                  loss='categorical_crossentropy',
                  metrics=['accuracy',
                           tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                           tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')])

# Custom Callback for Saving the Best Model in SavedModel format
class SaveBestModelTF(tf.keras.callbacks.Callback):
    def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
        super(SaveBestModelTF, self).__init__()
        self.monitor = monitor
        self.saved_model_dir = saved_model_dir

    def on_epoch_end(self, epoch, logs=None):
        current = logs.get(self.monitor)
        if current is None:
            logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
            return
        logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
        epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
        if not os.path.exists(epoch_path):
            os.makedirs(epoch_path)
        self.model.save(epoch_path, save_format='tf')

# Callbacks for monitoring progress
tensorboard_cb = TensorBoard(log_dir='./logs')

# Save class indices to a JSON file
class_indices_path = 'model_training/class_indices.json'
if not os.path.exists(os.path.dirname(class_indices_path)):
    os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
with open(class_indices_path, 'w') as file:
    json.dump(train_generator.class_indices, file)
logging.info(f"Class indices saved to {class_indices_path}")

# Model training
total_epochs = 7
model.fit(
    train_generator,
    initial_epoch=latest_epoch,  # Start from the next epoch
    epochs=total_epochs,
    validation_data=validation_generator,
    callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
)

# Evaluate the model
eval_result = model.evaluate(validation_generator)
logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

# Save the final model in SavedModel format (including .pb files)
model.save('model_training/finished_model')
logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

# Convert to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
tflite_model = converter.convert()
tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
if not os.path.exists(os.path.dirname(tflite_path)):
    os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
with open(tflite_path, 'wb') as f:
    f.write(tflite_model)
logging.info(f"Model converted and saved as {tflite_path}")
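The resume logic in the script picks the checkpoint subdirectory with the largest numeric name. A minimal stand-alone sketch of just that selection (the helper name and the demo directories are made up for illustration, and the `saved_model.pb` check is omitted):

```python
import os
import tempfile

def latest_checkpoint_epoch(checkpoint_dir):
    """Return (path, epoch) of the numerically-largest subdirectory, or (None, 0)."""
    if not os.path.isdir(checkpoint_dir):
        return None, 0
    subdirs = [d for d in os.listdir(checkpoint_dir)
               if os.path.isdir(os.path.join(checkpoint_dir, d)) and d.isdigit()]
    if not subdirs:
        return None, 0
    latest = max(subdirs, key=int)  # numeric max: '10' beats '9', unlike a lexical sort
    return os.path.join(checkpoint_dir, latest), int(latest)

# Demo with throwaway directories named after epochs
with tempfile.TemporaryDirectory() as root:
    for epoch in ("1", "2", "10"):
        os.makedirs(os.path.join(root, epoch))
    path, epoch = latest_checkpoint_epoch(root)
    print(epoch)  # -> 10 (a lexical max would wrongly pick '2')
```

The numeric key matters once training passes epoch 9; sorting the names as strings would resume from the wrong checkpoint.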
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
And when I try to load the model and make a prediction with this code:
import os
import json
import numpy as np
import tensorflow as tf
# F1Score must be the same custom metric class used during training;
# here taken from TensorFlow Addons, matching the training script.
from tensorflow_addons.metrics import F1Score

class own:
    def __init__(self):
        if not os.path.exists("models/own"):
            raise FileNotFoundError("Model path models/own does not exist")
        try:
            self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
        except Exception as e:
            print(f"Error loading model: {e}")
            raise
        if not os.path.exists("models/own/class_indices.json"):
            raise FileNotFoundError("Class indices path models/own/class_indices.json does not exist")
        with open("models/own/class_indices.json", 'r') as file:
            self.class_indices = json.load(file)
        self.index_to_class = {v: k for k, v in self.class_indices.items()}

    def classify(self, img_path):
        if not os.path.exists(img_path):
            raise FileNotFoundError(f"Image path {img_path} does not exist")
        # Load and preprocess the image
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
        img_array = tf.keras.preprocessing.image.img_to_array(img)
        img_array = np.expand_dims(img_array, axis=0)
        img_array /= 255.0
        # Make prediction
        predictions = self.model.predict(img_array)
        print("Raw predictions:", predictions)
        top_index = np.argmax(predictions[0])
        top_class = self.index_to_class[top_index]
        print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
        top_n = 5
        top_indices = np.argsort(predictions[0])[-top_n:][::-1]
        for idx in top_indices:
            print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
        return top_class
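The top-5 loop relies on `np.argsort(predictions[0])[-top_n:][::-1]`. The same selection can be sketched in plain Python, which makes it easy to check by hand (the probabilities below are toy numbers, not the model's real outputs):

```python
def top_k_indices(probs, k=5):
    """Indices of the k largest probabilities, highest first — a plain-Python
    equivalent of np.argsort(probs)[-k:][::-1] (tie order may differ)."""
    return sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]

probs = [0.01, 0.40, 0.02, 0.30, 0.05, 0.22]
print(top_k_indices(probs, k=3))  # -> [1, 3, 5]
```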
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: https://github.com/tensorflow/addons/issues/2807
  warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: https://github.com/tensorflow/addons
  warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 
1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 
1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 
2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 1.54079917e-05 9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]] Top class: omelette, Probability: 0.03991156816482544 Class: omelette, Probability: 0.03991156816482544 Class: steak, Probability: 0.03165709227323532 Class: tacos, Probability: 0.027573999017477036 Class: breakfast_burrito, Probability: 0.021740607917308807 Class: pulled_pork_sandwich, Probability: 0.01914990320801735 (own): omelette - 3.66s 
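For reference, the winning probability in the dump above (~0.0399) is only about 19× the uniform baseline of 1/475 ≈ 0.0021, i.e. the softmax output is nearly flat, which is why almost any class can surface at the top. A small sketch of that comparison (the toy distribution is an illustrative assumption, not the model's real output):

```python
def softmax_spread(probs):
    """Compare the winning probability against a uniform distribution."""
    n = len(probs)
    top = max(probs)
    uniform = 1.0 / n
    return {"top": top, "uniform": uniform, "ratio": top / uniform}

# Toy 475-way distribution: almost flat, one mildly favored class
probs = [1.0 / 475] * 475
probs[0] += 0.03
total = sum(probs)
probs = [p / total for p in probs]  # renormalize to sum to 1

spread = softmax_spread(probs)
print(round(spread["ratio"], 1))  # small ratio; a confident model would be far larger
```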
Help would be appreciated because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to tensorflow [link] [comments]


2024.06.01 14:15 cashlezz This game's combat is wasted on the Tower, and a suggestion to improve it

Wuthering Waves' main selling point is its very fluid, high-skill-ceiling combat. It's best showcased in skill-based boss fights or high-level enemies that really let you make use of the parry/perfect dodge mechanics and allow a lot of skill expression. The hologram fights and red overworld enemies are some of the best combat on mobile, or even PC, by far.
Then comes the main endgame mode, Tower of Adversity, and 90% of it is timed swarm challenges where hordes of mobs come at you that don't really require any skill to beat. All it is is a DPS check. There are the occasional solo bosses, but most of the time it's just hordes of HP-sponge trash mobs. You don't even need to dodge properly or parry. Just spam skill, ult, rotate, spam skill, ult, rotate.
The mode copies Genshin so much that it loses WuWa's unique identity, and it's a missed opportunity for players to engage with the intricate combat. While DPS checks are an easy way to gate progression and compel players to keep upgrading their characters, it's done so poorly here, with boring HP-sponge mobs that offer zero anticipation or excitement. Whenever you clear a stage, it feels like you've finished a chore rather than attained a sense of triumph. This is also why I stopped doing Abyss in Genshin and eventually quit: at some point, it just isn't fun anymore playing Dynasty Warriors simulator.
Instead, I propose a change to the Tower where DPS checks become more skill-based: you have to play well to earn the current bonus.
I propose a system where you:
  1. Stop the in-game clock, or even gain time, for every instance where you perform a perfect dodge or parry.
  2. Breaking the enemy's vibration bar would also slow the timer for its duration.
  3. There should also be a diverse range of 3-star clear requirements beyond just time, such as not getting hit more than x times, or parrying at least x times on boss stages.
  4. Furthermore, I propose that normal enemy HP be reduced but aggro increased, to allow for a more intense experience and more opportunities to use the parry and dodge mechanics. This would also make mobs feel less like pointless HP sponges.
This way, players who play really well but are just lacking that little bit of stats can still scrape by through skill, while whales who are already steamrolling still benefit from the incentive to actually play well instead of just ult-spamming like in Genshin.
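For concreteness, the timer rules proposed above (time refunds on perfect dodges/parries, a frozen clock while the vibration bar is broken) could be sketched roughly like this. This is purely an illustration of the suggestion, not game code, and all numbers are made-up placeholders:

```python
class ChallengeTimer:
    """Sketch of a stage timer that rewards skilled play (hypothetical values)."""

    def __init__(self, limit_s):
        self.remaining = limit_s   # seconds left on the clear timer
        self.paused = 0.0          # seconds of "frozen clock" still to consume

    def tick(self, dt):
        """Advance real time by dt seconds, honoring any frozen-clock window."""
        frozen = min(self.paused, dt)
        self.paused -= frozen
        self.remaining -= (dt - frozen)

    def on_perfect_dodge_or_parry(self):
        # Rule 1: gain time back for each perfect dodge or parry (placeholder: 1.5 s)
        self.remaining += 1.5

    def on_vibration_break(self, stagger_s):
        # Rule 2: the clock stops for the duration of the enemy's stagger
        self.paused += stagger_s

# Example: 120 s stage, one parry, one vibration break staggering for 5 s,
# then 8 real seconds pass (5 s frozen + 3 s counted).
t = ChallengeTimer(120.0)
t.on_perfect_dodge_or_parry()
t.on_vibration_break(5.0)
t.tick(8.0)
print(round(t.remaining, 1))  # → 118.5
```

The point of the sketch is that the timer becomes a skill meter: a player who parries often effectively gets a longer clock than one who facetanks.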
submitted by cashlezz to WutheringWaves [link] [comments]


2024.06.01 14:11 Crazy_Shape_4730 What are your go-to guns?

I feel like I should experiment more instead of just using the obvious ones, but also there are plenty I just haven't discovered.
I mainly use the 1899 handgun as a primary, sometimes paired with a Mauser pistol for effective dual wielding (I've thought about getting a second 1899 for that). If it's less about effectiveness and more about looking cool, I use a pimped-out Schofield revolver.
Litchfield repeater - seems like the best one but I've been using it forever.
Double barrel for shotty
Rolling block for long range
Additional loadout obviously bow, varmint rifle, and often a sawed off shotgun
Sometimes I'll throw a dynamite or fire bottle, but mainly if a big party of bounty hunters is very close but hasn't seen me yet, or just to burn people for fun.
The lasso is obv fun and can be used to sweep someone away if you don't wanna have to kill them but they're shooting at you.
On the other end of the spectrum, it's fun to hack away at unarmed innocents with a machete or any other bladed weapon
Ammo: Mostly express, only really switch to high velocity or normal when I'm running low. Split point when using deadeye (only found out split point uses less deadeye after I finished the epilogue). Almost never explosive ammo (seems like a hassle finding fat to craft). For shottys I use normal - I find the idea of a cluster of buckshot ripping people apart quite satisfying
submitted by Crazy_Shape_4730 to RDR2 [link] [comments]


2024.06.01 14:10 pippin0108 Has anyone had treatment just for high testosterone?

I’ve been to the doctors many times for PCOS-related queries but have been dismissed and told everything is “normal”.
I am 31 and my periods have been off all my life (cycles vary between 35 and 100 days), and I have progressively worse hirsutism plus constant severe bloating and fatigue.
Since I have had various ultrasounds which show no cysts, I'm not overweight, and I very thankfully had no issues getting pregnant (I must ovulate, just not often), doctors have said I have no hormonal issues.
I went to the doctor again in October 2023 and had a blood panel, because I noticed after I gave birth that my symptoms noticeably went away and then returned as soon as I stopped breastfeeding. But my blood panel all came back normal.
However, I was just looking up my actual results (as my hair growth, bloat and fatigue are out of control now, this is 2.5 years PP) and although I was in the “normal” range in the UK for serum testosterone, I’m actually just out of the normal range and quite far out of the “ideal” range for women in the US and other countries.
So I think I might have high testosterone levels, or at least on the very high side of normal, and this would explain a lot if my symptoms subsided when breastfeeding as that lowers testosterone.
They are 2.4 nmol/L, if that helps or if anyone can relate.
Is this an issue I could bring to my doctor and try to fix? I have never been on birth control before and am wondering if this could be a solution / has worked for people with PCOS or hormone imbalance.
submitted by pippin0108 to PCOS [link] [comments]


2024.06.01 14:04 pepezdejvic How tf do you score in this game?

I have been playing this game for 4 days now and I'm finally kind of used to the controls. It was frustrating because I'm a FIFA player and NHL's control system is completely different to me, so I was happy I finally somewhat got used to it. But now I'm just getting "bored" (more like frustrated) with the game, mainly because I just can't score. I know it's a skill issue, but it's just so frustrating. I shoot so much, and only when the target in the goal is green, but it still just doesn't end up in the goal. Like, how do you score?? Is this normal? I have never scored from a solo 1v1 with the goalie, long-range shots are absolutely impossible, and even on the close ones he always catches it. Do you have any tips? Please help.
Btw, I play on the lowest difficulty possible (beginner, not even semi-pro). I expected it to be easier to score on this difficulty; in FIFA it's actually hard for a beginner NOT to score on this difficulty, but here it's actually impossible for me.
submitted by pepezdejvic to EA_NHL [link] [comments]


2024.06.01 14:03 AdditionalWar8759 Rachel Goes Rogue Podcast: Episode from June 1st, “Chapter 28: Going Rogue Isn’t Easy”

***ads play and podcast starts at 1:47
Intro (Timestamp: 1:47) - Rachel: Welcome back to another episode of Rachel Goes Rogue. This is your host, Rachel Savannah Leviss. Today, we are talking about part three of the Vanderpump Rules reunion. - Rachel: It has finally come to an end, season 11. It's been a long time coming, and we're here to react. I have my producers with me, and as usual, they will be asking me some questions to get my perspective on what we just watched during the reunion.
Well, first of all, I want to start off with asking you just your overall thoughts on the reunion, watching it. How do you feel? (Timestamp: 2:19) - Rachel: Overall, I just feel tired at this point. I don't enjoy watching this show, and (Rachel starts to get emotional) I'm just happy that it's over. It was good that they didn't talk about me very much this last episode, part three. - Rachel: That's great, but it's been really difficult watching each week. And I feel like I can finally start to move on from all of this, because it's been really difficult. It was really heavy and sad. - Rachel: And I think everyone on that cast is struggling. And I would be too if I was there. I mean, I'm struggling just watching it from the sidelines, so I can only imagine what it's like being on that stage.
So you're getting really emotional right now. Where is this emotion coming from? (Timestamp: 3:28) - Rachel: It's coming from a place of feeling like I haven't had much room to go. Feeling like stuck between a rock and a hard place, so to speak. Because this entire time, I have been preparing for them to slander my name, to paint me in the worst light. - Rachel: And my goal with this podcast was to be able to represent myself, to defend myself, to share what I've learned through my time that I took away and my recovery, and just to shed more light on the situation. - Rachel: And it hasn't been easy. It's been an extreme rollercoaster of emotions in a lot of different phases, getting sucked back into it, and then feeling like all consumed by all the comments and everything, and then completely cutting off communication with the outside world and living in my own reality in the moment. It's all about that balance, and it has not been easy to move on. - Rachel: I don't think it's been easy for any of the cast to move on rehashing it and talking about it and having other people tune in. It's not typical. It's not normal. And the day has finally come that the show, season 11, is over, and it's a relief to me because I don't have to keep waiting for the other shoe to drop. - Rachel: I don't have to think about what lies they're going to spread about me, and I don't have to think about what I need to defend myself about. And then following week, I feel like I can finally start to live my life again.
And so you're kind of talking about the boundaries that you've been setting by staying away and cutting people off, and obviously boundaries were a really big topic at the reunion. You obviously set some really strong ones by not returning to the show. What's your take on this discussion of boundaries? Do you agree with Lala or do you side more with Ariana when it comes to boundaries in regard to filming the show? (Timestamp: 5:40) - Rachel: I could see both of their points of view. Setting a boundary for yourself is not an easy thing to do. And when other people are upset that you set a boundary for yourself, that's usually a telltale sign that that person is using you in some way and is not happy that you have this new boundary because it's not serving them. - Rachel: So, I can see why Ariana upheld her boundaries by not speaking to Tom, even though she actually did film with Tom this whole season, or for the later part anyway. But she refused to have that conversation with Tom at the end of the show, and I commend her for it because it would have been a fake conversation. You could tell that Tom's only motive for having that conversation with her was for camera purposes and storyline purposes. - Rachel: Therefore, it's not an authentic conversation. It would have been crocodile tears, the whole thing. And I completely understand Ariana walking away. I walked away too, and people weren't happy about that either. - Rachel: From Lala's point of view, I can understand her perspective in wanting to have a good TV show for her livelihood and the longevity of her career. If you're going to commit to filming, then I can see why Lala is upset, because Ariana is not only committing to filming with this person, she is living under the same roof as Tom. - Rachel: They're living together, they're filming together, yet in Lala's eyes, Ariana is being stubborn by not filming that one scene with Tom. 
Who even cares about that one scene? I don't know. - Rachel: It's all so silly to me, but boundaries are important. I was in a place where I didn't have boundaries, and I was really trying to appease production and put on a good show. That became my priority season 10.
And where do you think the line needs to be drawn, you know? When at the end of the day, this is a paycheck and this is a job, versus this is someone's real life. You've talked a lot about wanting to live in reality. Where do you think that line should be drawn? (Timestamp: 8:32) - Rachel: I think that's an impossible question to answer when you're filming a reality TV show, because the line is so blurry, it's impossible to know what's real and what's not. And the more I'm out of it, the clearer I can see that. We see it with Tom Sandoval when he talked about production. - Rachel: He did the New York Times article, and he stopped talking mid sentence when a plane flew over or a truck drove by, whatever it was, because the audio, typically when we're filming a show and a plane flies by, you stop talking so that the audio can pick up normally without the distraction in the background. - Rachel: So it's like programmed in your mind to think a certain way, to act a certain way, to talk a certain way, to pursue certain things, where it becomes a part of your patterning. We also see the lines get blurred with Scheana and the comment section, and what is real life and what is not, what is her own true motivation for doing certain things, and what is influenced by outside commentary. - Rachel: That gets so blurry, and when you're all consumed in the perception of yourself, how can you really be sure that you're operating from a place of an inner knowing? That's a boundary that's blurred. With Lala, she clearly prioritizes the success of the show because she wants to secure her paycheck, and when people are setting boundaries for themselves and it's conflicting with what she wants and what is successful in her eyes, that sparks an anger within her. - Rachel: And it's all fabricated to a certain point because the bottom line is this show. So, I think it truly is impossible to live a real life and be on a reality TV show.
So, do you think it's fair for Lala to direct that anger towards Ariana? Or do you think she should be directing it more towards the show? (Timestamp: 11:12) - Rachel: Oh, no, not at all. I don't think that it's fair that Lala is directing that anger towards Ariana because Ariana has been very clear with her boundaries since the very beginning and…
I guess if she's feeling this way, do you think maybe she should have upheld her boundaries more if she was feeling so resentful towards someone doing the same? Do you think she's feeling like she regrets things that she had said in the past? (Timestamp: 11:35) - Rachel: I think she did uphold her boundaries. I think that she feels like she hasn't been supported the same way that Ariana is being supported. And it's probably not a good feeling, but she maneuvered differently than Ariana has. And Lala doesn't extend the same empathy towards others. So it's harder to support her, I believe.
She does make a point to say, many times, that she feels like things are not being honest on camera. She points out Tom and Ariana’s relationship being one of those things. Katie has a flashback moment where she also calls it out. Do you agree that things are not always honest on camera? (Timestamp: 12:12) - Rachel: Totally. Yeah. I think the point that Lala is making is that Tom and Ariana haven't been honest about their relationship on camera. - Rachel: And I think people are getting caught up in Lala being hypocritical because she wasn't honest about her relationship with Randall. Okay, yes, that might be true. But the point is that Tom and Ariana haven't been good for quite some time. - Rachel: And their relationship that was portrayed on camera for fans to see was not an accurate representation of their relationship. I see the frustration because I agree with that too.
Even on your part, how does it affect you as someone on the show when people aren't fully honest on camera? How does that affect the rest of the cast? (Timestamp: 13:21) - Rachel: Yeah, it affects everyone when people aren't fully honest on the show. I mean, I wasn't fully honest the season 10 reunion. I was still covering up for Tom Schwartz. - Rachel: I was still covering up for Tom Sandoval. I was still going along with that narrative, and it would have been much better to just be open and honest about it. But of course, Tom was like, no, that wouldn't be good for business. - Rachel: It wouldn't be good for Schwartz and Sandys if people knew that the Schwartz kiss wasn't authentic and we need that to seem real. So it does affect everyone when you're not being honest, because it portrays a certain picture that isn't reality, and the whole point of reality TV supposedly is to be real, following these real people's lives. - Rachel: So honesty would be like the most important value characteristic you would think that everyone on this show should have. But it seems like nobody does.
Well, speaking of honesty, Ariana kind of called out Tom and his motives behind wanting to apologize on camera. He finally does get that moment during the reunion to apologize to Ariana. He has some words when he does, he calls the affair something he regrets every day. He says that he wears it like a badge of shame. On your end, how did that feel watching that? (Timestamp: 14:46) - Rachel: It's hard to tell if Tom is being honest or not. Even in the Secrets Revealed episode, when he was asked how many girls he had sex with since me, and he had to pause and think about if he was going to be honest or not, he's just been caught in so many lies that it's hard to tell if he's being truthful. - Rachel: But hearing Tom say that he regrets getting involved with me every single day, I regret it too, so it is a little bit painful, but it's also like maybe something is registering for him. - Rachel: I don't know. But then again, his actions speak a lot louder than his words. He knows what words to say, and then it seems that he fails to follow through with meaningful action. And that's where true amends come into play.
There was just, I feel like, a lot of pain in the room all around. You kind of acknowledged that at the beginning of this episode. What do you think that this pain, and even Lala saying that she was okay seeing some of those friendships end, what do you think that means for the future of this group? (Timestamp: 16:07) - Rachel: I don't see much of the future for this group. It looks pretty shattered. It looks like these friendships are not healthy friendships. - Rachel: The dynamic between Lala and Scheana is not a healthy dynamic. It seems to be like a power imbalance. It seems like Scheana is trying to appease Lala to make sure she's secure, and she's getting certain needs met in that friendship because Ariana hasn't been around for Scheana the way that she's used to. - Rachel: Yeah, you could tell that Scheana’s struggling with coping with that. It seems like Lala's really on a wavelength of not effing with anybody on the cast right now. It seems like her friendship with Katie isn't strong because Katie's gotten really close with Ariana. - Rachel: It seems like even her friendship with Scheana is a little rocky. I think she sees Scheana as someone that's not...How do I want to say this? - Rachel: And I hate saying this word, because I don't want to like categorize somebody as something, especially because I've been called this before too. But I think seeing how Lala reacted to everything and how Scheana was trying to be the fixer and appease Lala, and it just didn't seem like enough for Lala. I think Lala sees Scheana as someone who is weak, perceived weakness. - Rachel: I'm not saying that Scheana is weak. And I think that there's a lot of alliances and manipulation happening, and none of that is healthy for our friendship dynamic. I can see why the show is taking a hiatus, because it just seems so fractured
Well, it definitely seems like at the very end of the episode, Scheana was very sure to get that last word in. I felt like she was looking directly at Lala and almost begging for her to hear her out that she was on her side. And it really did seem like, at the very end, Scheana had to choose. Is she Team Ariana or Team Lala? Do you think she made the right choice? Do you think she needed to make a choice, or do you think that she's putting this pressure on herself? (Timestamp: 18:21) - Rachel: Ooh, that's a good question. I think she feels a lot of pressure from the outside perspective, and she doesn't want to, obviously, like burn bridges with Ariana or anything. And I think Ariana has been very gracious towards Scheana. Do I think that she needed to choose sides? I don't think so. I don't know. - Rachel: I can see Lala's frustration, probably because I'm sure Scheana and Lala have had conversations about the whole situation. And without Ariana there, I'm sure Scheana's singing a much different tune than what we're hearing at the reunion, and that's sparking some frustration in Lala. And I'm sure that was a similar feeling when she called out Katie about it too. - Rachel: So yeah, I think that Lala feels pretty isolated, I want to say, in her feelings. And now that it's aired, and I did check Reddit for the first time in a very, very long time, it seems like the majority of people are hating on Lala right now. I'm human. - Rachel: I do hold some resentment towards Lala for the way that she's treated me over the years. I do empathize with her a little bit because all the hate online is just a little bit ridiculous. And I think also people are afraid to speak a differing opinion than the Team Ariana side because people are just ruthless online and they don't want to hear a differing opinion. - Rachel: And if you do, then you get shunned out, too. 
It's very, my therapist calls it tribal shaming, where if you're not following the rules of the tribe, spoken or unspoken, then you're cast out and you're shunned.
***ads play and podcast resumes at 23:24
I mean, it does feel like the fans have had more of an impact on this season than ever. Would you agree with that? (Timestamp: 23:24) - Rachel: Yeah, especially because as they were filming this show, the fans were boots on the ground. They were going to production, going to filming, and taking photos and documenting what they saw and all that stuff. Like, it was very interactive in a way. - Rachel: I think with the after show this year, it was a little bit different because some things have changed since the end of filming last summer. One of those things was me starting my own podcast and speaking freely about my experience and my opinions, and the after show gave the cast an opportunity to rebut what I was saying, and it provided more context. - Rachel: And I think with more time passing from the end of filming last summer to, you know, early January, February of this year, when they filmed the after shows, cast dynamics shifted, because as we all know, now watching the finale, Lala and Ariana did not end on a good note whatsoever. - Rachel: And so, you know, she had some choice words to say during the after shows. And it seemed like she really got Scheana to support her with that.
Speaking about the fracturing of this cast, Something About Her did recently open. Not many cast members were in attendance at this opening. What's your take on that? (Timestamp: 24:56) - Rachel: Interesting. Do you know who went? - iHeart Lady: I know Schwartz went - Rachel: It seems a little telling that maybe Scheana and Lala aren't on the best terms with Ariana right now, because they went to, like, the Broadway opening that Ariana did for Chicago. And they also went to Dancing with the Stars. But this is all before they knew that she didn't watch the show. And so that was all before the reunion and everything. So yeah, it seems like maybe they're not on the best of terms right now.
What are your thoughts on production holding the last five minutes until the reunion to show to everyone? (Timestamp: 25:47) - Rachel: I wonder if they got word that Ariana wasn't watching the season. And they did that as a way to ensure that they would get a reaction from her, kind of like forcing her hand a little bit, forcing her into a situation that she did not want to be in. It was very strategic in that way. And it was something new. Like, we've never done that before. It was creative, for sure, on production's part.
Do you think it was fair to Ariana? (Timestamp: 26:27) - Rachel: There's a commitment, and part of that is watching the show and having an opinion on what's happening besides your own story that you're sharing. So in a way, it's like ensuring that Ariana did have an opinion on it. So very eye opening, to say the least.
I want your take on Tom's final words. He says, I love it. It's good for me. A lot of people in the room were very shocked by that. Tom even has a reaction to it, where he shakes his head no. They didn't even really press him on what he meant by that either. What's your take on all of that? (Timestamp: 26:49) - Rachel: I wish they pressed him on what he meant by that a little bit more. And Ariana was pretty much the only person that called him out on it too. She caught it. - Rachel: She was like, that exactly proves my point, that you are doing things for the audience, for the production value, and for his own story purposes. I guess in Tom's eyes, having Ariana refuse to film and walk off was good for him because he felt like he completed his job and fulfilled his duty with what production was asking from him. And Ariana was not. - Rachel: And I think selfishly, he probably thought that it would give him a better chance of having more of a redemption story. - Rachel: Because, ultimately, production is the one picking and choosing what they're going to share on the show and edit and put certain music behind certain scenes to make it seem even more of a certain way. Tom knows how to play into that. But I would have loved to hear what his explanation for that comment would be.
Why do you think they didn't press him? (Timestamp: 28:34) - Rachel: I think that they're protecting him, like they always have been.
We did see something interesting at the very end with Lisa stepping up and taking Ariana's side, which is kind of a different tune. You've talked about this before, where she seems to protect the guys a lot of the time, but then she changes her tune at the very end of the episode and takes Ariana's side. What are your thoughts on that? (Timestamp: 28:39) - Rachel: I think Lisa is very strategic with what she puts out there as well. And she knows what people are saying about her, with her always supporting the guys. So that could have been a motivation behind her changing her tune and supporting Ariana in that way. Yeah, I don't know. It's hard because I think also Lisa is very aware of who the fan favorites are. It's her show. - Rachel: She's an executive producer on this show. So she's not a dummy when it's coming to that. I think it helps her if she is supporting Ariana because she'll praise Ariana for walking away and end up holding her boundaries. - Rachel: But then when it comes to me, I don't even remember what she said about me. But when it comes to me walking away and setting a boundary for myself, I've been told that I'm a coward and I'm running away from my problems. - Rachel: So that part for me gets a little frustrating because it's like, and also the fans praising Ariana for upholding her boundaries and walking away and supporting her and telling her like, you know, she's outgrown this show. - Rachel: She should move on and do something even better with her life. And she's finding out now that these aren't her true friends and like good for her for upholding her boundaries and walking away from this situation. And I've done the same thing and it has been met with scrutiny.
Lala compares her situation with Randall to Ariana a lot throughout this reunion. Do you think the two are similar at all? (Timestamp: 30:37) - Rachel: I don't think that the relationship that Lala had with Randall is comparable to the situation that Tom and Ariana were in. It's hard to get on Lala's side with some of the things that she's saying, because the way that she spoke about her relationship with Randall is like bragging about doing BJs for PJs and getting gifted a Range Rover very early in their relationship and not being honest about who she was seeing and the situation that was happening basically. And it just seemed like she was in it for the money and like to secure her success and fame. - Rachel: So it's hard to get behind that, especially when she's been so outright about it. Unfortunately, Randall wasn't the stand up guy that she was selling him to be. We weren't buying it. - Rachel: In Ariana's case, viewers got to see that relationship develop over the years, whereas with Lala's, he wasn't around, like it was secret for a while. And, you know, it's harder to develop feelings towards a person or a relationship when you're not seeing it play out on camera. I think Lala has a lot of anger, maybe even towards herself, for the situation that she allowed herself to be in. And I think she might be taking that out on Ariana.
How hard is it to be really honest when you're in this position? And do you think certain cast members have an easier time doing this? (Timestamp: 32:22) - Rachel: So this is like where your own values come in. Like, are you an honest person or are you not? Because there are people in this cast that are not, and we know who they are, and they have no problem lying, and it doesn't bother them when they lie. - Rachel: And for me, I'm working towards living a more authentic, honest life. And part of that is being honest with my emotions, thoughts, and feelings, and expressing that, and doing that in a way that is still respectful, because I'm not trying to hurt people in the process. And I am trying to express myself honestly and be true to myself. - Rachel: So I think it just depends on who you're asking. I mean, it's definitely not easy. It's definitely hard because you're on this platform, this public arena where you're opening yourself up to scrutiny. - Rachel: And if other people have differing opinions than you do, or if your opinion is the minority, you're basically going to be harassed and scrutinized. And so sometimes for people, it's easier to not be fully honest with their thoughts and feelings in order to save face or in order to go with more popular opinion because it's perceived to be safer that way. But I don't know. - Rachel: At this point, it's like your words aren't going to hurt me. You can say whatever you want to say about me online, and I've survived this far. So whatever else you say about me is not going to affect me any more than it already has. - Rachel: I've developed thick skin through this process, and I've come to the point where I value my friendships that are real in the sense of I interact with these people in real life. I care more about people's perception of me when they actually meet me and interact with me and the vibes I give off that way. 
So you get to a certain point where it's almost your duty to show up for yourself and be honest with how you feel and how you think about a certain thing in that moment. - Rachel: And your opinions can change with time too and with more information. It's not like I'm going to say this one thing and I'm always going to feel this way. It's always changing, it's always developing, we're always getting more information, and we're always experiencing new things that change our perspective on life. - Rachel: So it's just your duty to represent yourself in the most authentic way so that your people will find you.
***ads play and podcast resumes at 38:08
Well, I think there was one kind of shining moment, I'll say, even though it was a really emotional moment. But the moment between, and this is a little bit of a pivot, but the moment between Schwartz and Katie, I found really interesting, where Andy was asking about their relationship. It seemed like this season, they had a little bit more of a playful dynamic. But Schwartz gets really emotional, saying that he doesn't regret how their relationship ended. But you can kind of see the tears well up in his eyes. He gets really emotional. What did you make of that moment? (Timestamp: 38:08) - Rachel: I don't think we've really seen a moment like that between Tom Schwartz and Katie. It really seems like they've come to terms with how the relationship ended, and that it was for the best. But it seemed like there were a lot of fond memories and just appreciation for one another, and I don't think I've really seen that dynamic between them before. - iHeart Lady: To me, it seemed like in a season where there was a lot of hurt, that seemed like the one moment of maybe seeing two people that are going through the process of healing. - Rachel: Viewing that, it did seem like they were both coming from a place of healing, because they weren't throwing insults at each other or trying to bring each other down. It was very much respecting one another and appreciating the moments that they did have together while it lasted. And that's refreshing to see on this show.
Lala said something at the very end where she said it was really hard for her to show up to season nine reunion, I believe it was. You know, she didn't want to talk about certain things, but she showed up. Ariana said the same thing where she could say the same about the season 10 reunion. She didn't want to be there. You could probably say the same thing about the season 10 reunion. You didn't want to be there as well. Is it fair to say everyone's been in a position where they didn't want to be somewhere, but they did anyway? (Timestamp: 39:44) - Rachel: 100%. Yeah, totally. And that's like the part of committing to this show. It's a commitment. And even though you don't quite know what you're signing up for, you know that it's not going to be necessarily easy. And there's a challenge in that. - Rachel: And I think, just speaking for myself, there was an opportunity for growth for me in that. Yeah, I think we've all been in a situation where we didn't want to show up for something and felt, I don't think obligated is the right word, but we made a commitment to being there, and we followed through with our commitment. And it's hard.
You started this episode off by acknowledging that there was a lot of healing that this cast needs to do. As someone who has taken a step back from filming, you've had this time to kind of come back to your own reality. What can this cast expect when you have that moment to kind of breathe and have that separation and you rejoin reality for a minute? (Timestamp: 41:07) - Rachel: Oh, okay. That is a loaded question. Because I think that there's a little bit of fear with not being the current topic of conversation. - Rachel: I think addiction is the wrong word, but there's a little bit of the dopamine hits that you get when you're being talked about on a reality TV show and the fear of that going away permanently could be a scary thing. But taking time off and re-centering with yourself, I think is like the best thing for this cast right now, because we don't want to be forced into situations that we don't want to be in. That's not living an authentic life. - Rachel: I mean, I've been worrying about scenes and storylines, and I haven't even been a part of this show, but now it feels good not to worry about that. And I do have to say, just like reading all the comments on Reddit right now, it's like hardly anybody is talking about me, which is a great feeling. It's just so much more freeing when you're not living your life for somebody else's entertainment anymore. - Rachel: It just feels like you get your life back a little bit. It's so complex, and I think it's hard to understand if you haven't been through being on a TV show for millions of people to comment on and judge your life. I don't think humans are meant for that, and there's no way that that's healthy. - Rachel: Yeah, I said that I think the cast, we have a lot of healing to do. We, as in, I still do too, and part of that is coming back to reality. And I really don't think that we've had a minute this whole season. I think it's going to be good for everyone.
Has this year though felt different to you? I feel like you're like half in, half out (Timestamp: 43:42) - Rachel: Oh, yeah, it's felt so different. But I think like a large part of that has to do with going to The Meadows and really reconnecting with myself and learning about my issues and how it was showing up for me and really coming to terms with like, what is this piece of external validation and how is that motivating me? And is it even real? - Rachel: And just like really re-centering back into myself and gaining a lot more perspective with that. Without The Meadows, I would not be where I am right now. There's no freaking way. So it is. I'm living a new life. I really am. - Rachel: And I feel like I haven't really been able to truly have the opportunity to live my new life to the fullest because this show has been holding me back. And I know that that's partially my fault too because I'm indulging and speaking about it, but I'm really looking forward to the days when I can truly move forward and evolve into something even more magnificent.
Outro (Timestamp: 45:02) - Rachel: Thank you so much for listening to Rachel Goes Rogue. Follow us on Instagram and TikTok for exclusive video content at Rachel Goes Rogue Podcast.
***end
submitted by AdditionalWar8759 to vanderpumprules [link] [comments]


2024.06.01 14:02 Born-Animator749 Doctor won't help me

I have had 3 tests -
LH 5.4, 1.4, 1.8 IU/L
FSH 2.7, 2.5, 2.8 IU/L
SHBG 15, 11 nmol/L
Prolactin 269 miU/L
Testosterone 6.5, 14, 8.7 /L
Free testosterone 0.388 miU/L
Takeaways - my free testosterone seems in range but my total testosterone is all over the place. My SHBG is extremely low, and LH also seems low.
I had an MRI to rule out any growth in the pituitary gland; the results came back fine.
I asked the doctor what else we can investigate because this is not normal for a healthy 24-year-old, and he said that sometimes these things happen and that he won't investigate anymore.
Context: I have 0 libido, can't get hard without stimulation, and even then the erections are weak. I have to take Cialis in order to perform. I stopped taking 50mg sertraline around 1 year ago. My ability to orgasm is much better but libido is down.
Just want to know the cause. I'm healthy and 24; I'm supposed to be in my prime.
submitted by Born-Animator749 to Testosterone [link] [comments]


2024.06.01 13:57 ellaly Abrupt, significant rise in AMH 2.4 -> 7.6

37yo F, normal BMI.
15 months ago, I did an online mail-away AMH blood test. I was on an oral contraceptive. The result was 2.4, "normal" per their reference range and my OB-GYN's comment.
Last week, during a workup for egg freezing, I had an ultrasound with an antral follicle count of 34. I got my AMH result yesterday: 7.6! I've been off OC for about 11 months. Cycle length 30-32 days. I have no discernible hirsutism, historically very, very occasional mild acne, and my weight has been stable. No family history of DM or PCOS.
I am due to see the fertility NP on Monday, and I reached out to schedule an appt with my OB-GYN about this. However, I am so puzzled by this that I seek the wisdom of the online community.
Most specifically, what would cause this shocking rise? Do I have PCOS, and does it onset like this?
Thank you for your thoughts. My head is spinning and I appreciate any input.
submitted by ellaly to AskDocs [link] [comments]


2024.06.01 13:56 genericusername1904 H.G. WELLS’S THE SHAPE OF THINGS TO COME (1933) VS. 1984 AND BRAVE NEW WORLD


ID, IX. MAIORES. V, CAL. IUNI. FORTUNA PRIMIGENIA.

I discovered this book by complete chance last year – a very old hardback copy was given to me as gift (in a situation which was certainly weighted with the most unlikely of synchronicities), “huh,” I thought, “it’s a first edition of H.G. Wells,” the book itself almost cannot be opened because it is so old and falling apart so I procured a text and audio file of the thing relatively easily and began to read. In hindsight not only for myself but I fancy for the generations of the last fifty years - in all totality, it is deeply strange that this book has not been more widely recognized or taught in schools, as like 1984 and Brave New World, as being the third contender (although technically the second, published one year after Huxley – seemingly written at the same time interestingly enough) in “visions of dystopia” – except that the book is not so much a vision of dystopia tomorrow but a vision of dystopia ‘today’ or rather ‘life as we know it’ of the 19th, 20th and 21st Centuries (endless war, endless pandemics, economic and logistic chaos), narrated from the comfortable and reassuring position of a society far far in the future who have long since revised their culture and solved all of the causes of the problems and become a society of genius polymaths “with (every Man and Woman) the intellectual equal of the polymaths of the ancient world.”
Now, I do not mean here to seem to ‘sweet-talk’ the reader into rushing out and buying this book or to hold it up in the manner of those other books as if it were some ideological blueprint but instead to assay the thing in the natural context which seems to me to be universally unrealized and which presents itself to us as a thing which is plainly self-evident, that is: that in the depressing and miserable dichotomy of 1984 and Brave New World; two extremely atomizing and miserable narratives, that there is also – far more empowering – The Shape Of Things To Come wherein the miserable protagony and antagony of both 1984 and Brave New World might read as merely a footnote somewhere in the middle of the book as an example of the witless measures mankind's old masters undertook to preserve their power in an untenable circumstance. In other words, we know all about 1984 as children; we have this drummed into our heads and we glean our cultural comprehension that dictators cannot be cliques of business people but only lone individuals, usually in military uniform, and then we graduate from that to Brave New World to gain a more sophisticated comprehension of the feckless consumerism and ‘passive egoism’ by which our society actually operates, but then we do not – as I argue we ought – continue along in our education with this third book which actually addresses the matters at hand at a more adult level.
For instance, ‘The Breakdown Of Finance And Social Morale After Versailles’ (Book One, Chapter Twelve) addresses in a single paragraph the cause of our continual economic chaos (from which all crime and poverty and war originate) and highlights the problem by which this chaos cannot be resolved yet could easily be resolved: “adjustment was left to blind and ill-estimated forces,” “manifestly, a dramatic revision of the liberties of enterprise was necessary, but the enterprising people who controlled politics (would be) the very last people to undertake such a revision,”

…the expansion of productive energy was being accompanied by a positive contraction of the distributive arrangements which determined consumption. The more efficient the output, the fewer were the wages-earners. The more stuff there was, the fewer consumers there were. The fewer the consumers, the smaller the trading profits, and the less the gross spending power of the shareholders and individual entrepreneurs. So buying dwindled at both ends of the process and the common investor suffered with the wages-earner. This was the "Paradox of Overproduction" which so troubled the writers and journalists of the third decade of the twentieth century.

It is easy for the young student to-day to ask "Why did they not adjust?" But let him ask himself who there was to adjust. Our modern superstructure of applied economic science, the David Lubin Bureau and the General Directors' Board, with its vast recording organization, its hundreds of thousands of stations and observers, directing, adjusting, apportioning and distributing, had not even begun to exist. Adjustment was left to blind and ill-estimated forces. It was the general interest of mankind to be prosperous, but it was nobody's particular interest to keep affairs in a frame of prosperity. Manifestly a dramatic revision of the liberties of enterprise was necessary, but the enterprising people who controlled politics, so far as political life was controlled, were the very last people to undertake such a revision.

There is a clever metaphor I fancy that Wells worked into this for the ‘actual’ de facto controlling class of things, that is: not really the politicians (sorry to disappoint the Orwell and conspiracy fans) but instead the ‘Dictatorship of the Air’ which might easily read as the ‘Dictatorship of the Airwaves’ – in colloquial language, that being radio and then television. Certainly we might imagine Rupert Murdoch or Ted Turner or Sumner Redstone (of yesterday) entering into honourable retirement as like the ‘dictators of the air’ of the very last days before the establishment of a one world state – in any case that is how things would work out, as the power of, say, Ted Turner to eradicate a political party in the United States – at any time he wishes – by simply green-lighting coverage of their bad actions relentlessly for months until revolution occurs is a real power which no other institution possesses nor possesses any means of defence against, i.e. the ‘real power’ in our world to end a war or begin a war or end this or begin that is that power held by the organized press. This metaphor is somewhat of a more mature view, I think, than Wells's earlier conception of the press in The Sleeper Awakes (1899) where the press of a dystopian future is visualized as a “babble machine” spreading circular nonsense to preoccupy the citizenry (although this is arguably a true representation of the mental processes of the Twitter and Facebook user, or of the general baby-speak and extremely infantile form of the news reports on the front page of the BBC News website) which is more or less what the press is depicted as being in Brave New World also.
However the construction of sudden new realities (or sudden ‘actualities’) presented by the equation of interdependent technological innovations (i.e. the radio and the television in this instance) is mentioned early on in The Shape Of Things To Come in ‘How The Idea And Hope Of The Modern World State First Appeared’ (Book One, Chapter Two),

The fruitlessness of all these premature inventions is very easily explained. First in the case of the Transatlantic passage; either the earlier navigators who got to America never got back, or, if they did get back, they were unable to find the necessary support and means to go again before they died, or they had had enough of hardship, or they perished in a second attempt. Their stories were distorted into fantastic legends and substantially disbelieved. It was, indeed, a quite futile adventure to get to America until the keeled sailing ship, the science of navigation, and the mariner's compass had been added to human resources. (Then), in the matter of printing, it was only when the Chinese had developed the systematic manufacture of abundant cheap paper sheets in standard sizes that the printed book—and its consequent release of knowledge—became practically possible. Finally the delay in the attainment of flying was inevitable because before men could progress beyond precarious gliding it was necessary for metallurgy to reach a point at which the internal combustion engine could be made. Until then they could build nothing strong enough and light enough to battle with the eddies of the air.

In an exactly parallel manner, the conception of one single human community organized for collective service to the common weal had to wait until the rapid evolution of the means of communication could arrest and promise to defeat the disintegrative influence of geographical separation. That rapid evolution came at last in the nineteenth century, and it has been described already in a preceding chapter of this world history. Steam power, oil power, electric power, the railway, the steamship, the aeroplane, transmission by wire and aerial transmission followed each other very rapidly. They knit together the human species as it had never been knit before. Insensibly, in less than a century, the utterly impracticable became not merely a possible adjustment but an urgently necessary adjustment if civilization was to continue.

In other words, then, a global state (or, rather, such power in general held by the press as I see the analogy extending to them as being the ‘Dictatorship of the Airwaves’) was impossible to imagine and completely laughable before the technologies had stacked together to reveal, as like in a simple piece of arithmetic, a single outcome of the equation; that no sooner had the technologies existed than the thing had become an actual reality – in that 1) unassailable political power had been unthinkingly dropped into the lap of the owners of the press, but that more importantly as a consequence 2) mankind was subject to that power, that is: the situation existed the moment the technologies did – and this whether any living person had even realized it, as I think quite naturally all the time Men and Women invent things that they really have no notion of the fullest or most optimal uses of (“nothing is needed by fools, for: they do not understand how to use anything but are in want of everything,” Chrysippus), e.g. in no metaphor the television was quite literally invented as a ‘ghost box’ to commune with ghosts imagined to reveal themselves by manipulating the black and white of the static until someone else had the idea that there was at least one other use for that contraption.
It is quite strange, also, that in contemporary times we have for ages been heavily propagandized ‘against’ the idea of a “one world state” as if, say, all the crimes and fecklessness that have gone on in our lifetimes are somehow secretly building towards the creation of such a thing – not a thing you would naturally conclude from an observation of those events nor a thing advocated for by anybody (insofar as I have ever heard) but it is a thing which would be the first logical response to ‘preventing’ such crimes from ever occurring again – such as like the already widely practiced concept of a Senate-Style Federation of Sovereign States rather than a hundred or so mutually antagonistic polities capable of bombing themselves or screwing up their economies and creating waves of refugees or mass starvation or pandemics, and so on. For instance, all Egypt is dependent on the flow of the Nile which originates in what is today another country; that other country recently decimated the flow of the Nile by gumming up the Nile with a Hydroelectric Dam. Such an outcome would not occur if the total mass of the land itself was governed as the single interconnected economic and environmental system that it is in physical reality, for which, when divided along arbitrary borderlines, there is no means to govern the entirety of the region in an amicable and prosperous manner for all as a whole and no recourse to the otherwise intolerable situation but War, which is unlikely to occur – as most Nations are comprised of civilized peoples who rightly loathe the concept of War – but it is the single and unavoidable outcome to resolve such a situation once that situation has dragged on for decades, causing immense suffering, until it reaches that point of desperation – the matter of Palestine and Israel, fresh to my mind in these days, raises itself also.
Of the matter of War itself, in ‘The Direct Action Of The Armament Industries In Maintaining War Stresses’ (Book One, Chapter Eleven), Wells relays in 1933 what United States President Eisenhower would later remark in 1961 in his farewell address on the dangers of the Military Industrial Complex; albeit far more analytically on Wells's part, that: it is not so much the ‘desire to harm’ on the part of the armament industries which sees them engage in unnecessary build-up of weapons stockpiles but that it is simply their business to produce, to stockpile, to produce more deadly variants and stockpile the more deadly variants and sell off their old stockpiles to whomsoever rings their doorbell; for instance the on-going War in Ukraine is no different in this regard to the Viet Cong and NATO Warfare in Vietnam in that massive quantities of cheap munitions were necessary for the war to be fought in the first place and massive quantities of munitions happened to exist as a by-product of the Armaments Industries to be dumped onto the warring parties in order to facilitate their macabre impulses at the expense of the citizenry; both at their cost in terms of the debt taken on to procure the weaponry on the part of their governments and in terms of their lives when the weaponry was utilized to the outcome of massive loss of life of a single peoples within a bordered space – a thing of no value to themselves. Simply put, albeit in a very simplistic reduction to the bare basics: the War would not have reached such catastrophic inhuman proportions without massive quantities of cheap Armaments that otherwise sat taking up warehouse space for more valuable Armaments on the part of the producer and seller.

In a perpetual progress in the size and range of great guns, in a vast expansion of battleships that were continually scrapped in favour of larger or more elaborate models, (Armament Firms) found a most important and inexhaustible field of profit. The governments of the world were taken unawares, and in a little while the industry, by sound and accepted methods of salesmanship, was able to impose its novelties upon these ancient institutions with their tradition of implacable mutual antagonism. It was realized very soon that any decay of patriotism and loyalty would be inimical to this great system of profits, and the selling branch of the industry either bought directly or contrived to control most of the great newspapers of the time, and exercised a watchful vigilance on the teaching of belligerence in schools. Following the established rules and usages for a marketing industrialism, and with little thought of any consequences but profits, the directors of these huge concerns built up the new warfare that found its first exposition in the Great War of 1914-18, and gave its last desperate and frightful convulsions in the Polish wars of 1940 and the subsequent decades.

Even at its outset in 1914-18 this new warfare was extraordinarily uncongenial to humanity. It did not even satisfy man's normal combative instincts. What an angry man wants to do is to beat and bash another living being, not to be shot at from ten miles distance or poisoned in a hole. Instead of drinking delight of battle with their peers, men tasted all the indiscriminating terror of an earthquake. The war literature stored at Atacama, to which we have already referred, is full of futile protest against the horror, the unsportsmanlike quality, the casual filthiness and indecency, the mechanical disregard of human dignity of the new tactics. But such protest itself was necessarily futile, because it did not go on to a clear indictment of the forces that were making, sustaining and distorting war. The child howled and wept and they did not even attempt to see what it was had tormented it.

To us nowadays it seems insane that profit-making individuals and companies should have been allowed to manufacture weapons and sell the apparatus of murder to all comers. But to the man of the late nineteenth and early twentieth centuries it seemed the most natural thing in the world. It had grown up in an entirely logical and necessary way, without any restraint upon the normal marketing methods of peace-time commerce, from the continually more extensive application of new industrial products to warfare. Even after the World War catastrophe, after that complete demonstration of the futility of war, men still allowed themselves to be herded like sheep into the barracks, to be trained to consume, and be consumed, by new lines of slaughter goods produced and marketed by the still active armament traders. And the accumulation of a still greater and still more dangerous mass of war material continued.

The book is, as the reader has likely already gathered from the excerpts, not written in the style of a protagonal narrative; i.e. not as a story, i.e. no hero and no villain, but as a sort of Historia Augusta – that is really the most fitting comparison I think of when trying to describe this to a new reader (or perhaps J.J. Scarisbrick’s Henry VIII), that is to say it is written ‘as’ a History in the classical style we are familiar with from the better of the ancient writers, as like Appian or Cassius Dio, but unlike Suetonius or Tacitus it is absent of the sloppy hinging of all bad things on the highly personalized propaganda ad hominem (i.e. blame the fall of empire on one guy) that goes on in those narrative works as we are typically familiar with them.
It is, of course, a work of fiction; although Wells did predict World War Two beginning in late 1939-1940 (although he had Poland putting up a much better and longer fight against the Germans) and various other innovations, beginning from his own day with a true account of events prior to his own day – giving us a valuable account of affairs and actors prior to 1933 which would otherwise not come easily to any of us to discover. But the book, ultimately, is a vehicle for the transmission and discussion of these societal (i.e. social, economic, industrial, logistic) matters presented to the audience of the day fresh, in their own minds, from the abject horror recently witnessed in World War One – and the economic catastrophes for which Roosevelt's reforms had not yet come into tangible reality (i.e. relief for the poor, public works projects such as the motorways across America) as is discussed in that other seemingly little-known H.G. Wells literary offering, his face-to-face interview with Josef Stalin the following year in 1934 (something which I think is of far more historical value than, say, Nixon and Frost or Prince Andrew and Emily Maitlis), so as to ‘avert’ another crisis and pluck from the ether a seemingly alternate trajectory of where Mankind might at last get its act together.
This ‘novel’ (though it seems strange to call it that) ought to be read, I would advise, in conjunction with ‘The Sleeper Awakes’ (1899) and also the (actually very depressing – I would not advise it) short-story prequel ‘A Story Of The Days To Come’ (1897) – set in that same universe – which, perhaps it is because I am English, seems to me to be a black horror show of the reality that we actually find ourselves living in this far into an actually dystopic future – or perhaps yet, with the ‘strange windmills’ powering the mega cities, that this is a future yet to come (no pun intended); the broken speech, the babble machines, the miserable condition of the Working Class and their consumption of pre-packaged soft bread, the desire to flee the urban sprawl into the dilapidated countryside and make a little life in a run-down house with tacky wallpaper peeling away … ah, forgive me, my point is that ‘our condition’; i.e. those of us literate in English, is quite analogous to the condition of the central characters in those two stories; a culture dulled intellectually to the point that they can barely speak or think, being appraised and assayed by ourselves; those of us simply literate, as to render our commentary stuck as to seem as mutually alien as like Caesar in Gaul. However, it is in the context of the frame given to us in ‘The Shape Of Things To Come’ that we might gain a degree of sanity about this self-same situation; to study and lean into that dispassionate quality as to discern the nature of things as they are and recognize how important this quality is in relation to Wells's ultimate outcome for the best possible position of Humankind in the far, far future, that is: that of Humankind’s vital intellectual capacity, and that the most striking message of STC, beyond all we have mentioned in this little overview, is that intellectual capacity in and of itself.
For example, when we consider the ‘actuality’ of the power of Turner or perhaps Zuckerberg in his heyday, for instance, we consider a power fallen into a Man's lap by an accidental stacking of disparate technologies created not by himself but which possess a power utterly dependent in that same equation upon a population being ‘witless’ in the first place and so led slavishly by the “babble machines”. However you cut it, reader, the great uplifting of Humankind to a standard of autonomy and intellectual prowess – not held by an elite but possessed by All People – is a thing both intrinsically self-sufficient within our grasp for our own selves and is certainly the prerequisite for political matters, in that the intellectual capacity of the voting public determines entirely whether a public is tricked or foolish and gets themselves into trouble by undertaking some obvious error or whether they are immune to such trickery and foolishness in the first place so that their energies and time are spent on more valuable pursuits. It seems to me that our contemporary society has done away with the notion of good character through intellect and that we live with the outcome of this; being shepherded by emotional manipulation and brute force because our society at large is treated as if we lacked the verbal and intellectual toolsets to understand anything else – moreover possessing no means to discern whether or not what is forced onto us is right or wrong; truth or lies, and so on.
Such a society as this, again it seems plain to me, is ‘any’ dystopia because it is the baseline composition for ‘all’ dystopia; as like the foolish dogma of an out-dated ideology for example rests itself upon a large enough contingent of the public being either treated as if they were or in fact are “too foolish” to discuss or think a thing through, so a dogma is poured over them like concrete creating, in turn, intolerable circumstances as the dogma, tomorrow, becomes out-dated and suddenly instructs them to do foolish things, as like in the “Banality Of Evil” (read: Hannah Arendt) as the character in all serious perpetrators of inhumanity who insist, with a confused expression on their faces, that they were just doing their job – and this ‘quality’, of extreme ignorance, is the composition of the culture where such ‘evil actions’ occur.
I mean here that in STC we have on one hand a very in-depth account, very serious reading, to graduate the reader out of the depressive, atomizing, disempowering, conspiratorial milieu and mire of ‘life’ presented to us in 1984 and Brave New World, but that we have at the same time the very resonant harmonics that one does not need to “wait around for a distant future utopia” to “solve all the problems” but that the tools to do so are well within our grasp at any time we so choose and that such an undertaking constitutes the foundation stones and tapestries of that future utopia which, I think, could be said to “meet us half-way” in many of these matters, as like we reach forward and they reach back and then those in the past reach forward and we in the present reach back; that is anyway what it is to learn from the past and anyway the answer to “why the Grandfather sows the seeds for trees from whose fruits he will never eat.”
Valete.

ID, IX. MAIORES. V, CAL. IUNI. FORTUNA PRIMIGENIA.

FULL TEXT ON GUTENBERG OF H.G. WELLS ‘THE SHAPE OF THINGS TO COME’ (1933)
submitted by genericusername1904 to 2ndStoicSchool [link] [comments]


2024.06.01 13:46 lentildaswinton At the end of my tether

I would never normally turn to Reddit but I’ve reached the end of my tether. I’m an ex-nurse so medically, I’m good - I don’t need medical advice, more an outlet to rant and get some support. If you have nothing nice to say, please don’t bother; my mental health is incredibly fragile right now.
I’m 34 next month, and I started my period for the first time at 8 years old (it was apparently triggered by CSA).
From the moment it started until now, it has been HORRIFIC. Waking up in pools of blood, barely able to leave the house without bleeding everywhere - never been able to wear skirts, shorts, or anything white. There’s never been a pattern, some periods would last for 3 months, sometimes it would disappear for a year.
I was diagnosed with PCOS in 2017 and was put on metformin. This did nothing to help.
In 2018, I had weight loss surgery as my BMI was (and still is) out of range for IVF. I have had 11 miscarriages - 4 of them were when my weight exceeded 28 stone, 5 were when my weight was at 18 stone, 2 at 16 stone, and 4 when my weight increased slightly.
I have been tested for natural killer cells, I’ve been tested for thrombocytopenia, I’ve been tested for EVERYTHING - nothing has come up except a dodgy thyroid (which I’m on medication to fix).
Anyway, after my surgery, my periods relaxed a bit and I started having one every three months or so which lasted about 3 weeks. Everything was okay until - and I am NOT anti-vax in the slightest! - I had my AstraZeneca jab on 6.2.21 - literally a day later, I bled until 12.6.23. I bled, heavily, every single day for over two years.
I was referred to gynaecology in March 21 and they didn't see me until August 22, when I was blue-lighted to A&E with severe iron-deficiency (iron levels 2) and they did a biopsy and a scan. The scan showed thickened endometrium and the biopsy was normal.
I ended up in hospital with an adrenal crisis and the bleeding still wouldn't stop. They gave me tranexamic acid, mefenamic acid, norethisterone - absolutely NOTHING would stop the bleeding. Eventually they discharged me and sent me for an emergency iron infusion.
Fast forward a few months and the same thing happens again. Back to hospital, another iron infusion, still no medication would work. They did another biopsy - nothing.
This happened again every few months for about a year which takes us up to last June when I had another biopsy which came back showing “endometrial hyperplasia” but I received a letter from the specialist who said everything was normal. I tried to clarify with them but they weren’t sure what was going on. So I asked for a second opinion at a different hospital.
The new hospital did a biopsy which came back normal (this was Feb 24). After the biopsy, I stopped bleeding. It finally stopped! Until March came along and then I started bleeding AGAIN - I am STILL bleeding now. I’m off for another iron infusion at 2:30 today because my iron stores have dropped to 2 again and I’m at my wits' end.
I’ve had 7 iron infusions in 2 years, no medication helps, no dietary changes help, no vitamin, mineral or nutritional deficiencies are documented. I eat very healthily, I exercise and I do everything I can to support my hormonal health (naturally, no additional supplementation aside from folic acid and folate).
I’m booked in to have the Mirena coil fitted (again) at the end of June (apparently no sooner slots) but I’m losing the will to live. I genuinely cannot continue waking up like this day after day. I haven’t been able to work for four years because this is ruling my life.
They won’t do a hysterectomy “in case” I want to have children, they won’t investigate for endometriosis because I have “too much scar tissue” and they won’t check my egg quality because my BMI is just over 35. I’m working to lose weight (even though I lost 12 stone through weight loss surgery) but my body has plateaued and I can’t seem to lose any more. My endocrinologist is scratching his head, I’m at a loss, and I’m genuinely fearful that if this continues much longer, I might end up taking drastic action.
submitted by lentildaswinton to Periods [link] [comments]


2024.06.01 13:44 Jahcrsde My game crashes when clicking on any button

---- Minecraft Crash Report ----
// You should try our sister game, Minceraft!

Time: 2024-06-01 06:30:21
Description: mouseClicked event handler

java.lang.NoSuchMethodError: 'net.minecraft.class_339 net.minecraft.class_442.method_25411(net.minecraft.class_339)'
	at net.minecraft.class_442.handler$dom000$gbfabrictools$addConfigScreen(class_442.java:1530)
	at net.minecraft.class_442.method_25426(class_442.java:156)
	at net.minecraft.class_437.method_25423(class_437.java:297)
	at net.minecraft.class_310.method_1507(class_310.java:1080)
	at net.minecraft.class_8032.method_49296(class_8032.java:98)
	at net.minecraft.class_8032.method_25419(class_8032.java:90)
	at net.minecraft.class_8032.method_48639(class_8032.java:74)
	at net.minecraft.class_4185.method_25306(class_4185.java:94)
	at net.minecraft.class_4264.method_25348(class_4264.java:56)
	at net.minecraft.class_339.method_25402(class_339.java:189)
	at net.minecraft.class_4069.method_25402(class_4069.java:38)
	at net.minecraft.class_312.method_1611(class_312.java:98)
	at net.minecraft.class_437.method_25412(class_437.java:409)
	at net.minecraft.class_312.method_1601(class_312.java:98)
	at net.minecraft.class_312.method_22686(class_312.java:169)
	at net.minecraft.class_1255.execute(class_1255.java:102)
	at net.minecraft.class_312.method_22684(class_312.java:169)
	at org.lwjgl.glfw.GLFWMouseButtonCallbackI.callback(GLFWMouseButtonCallbackI.java:43)
	at org.lwjgl.system.JNI.invokeV(Native Method)
	at org.lwjgl.glfw.GLFW.glfwPollEvents(GLFW.java:3403)
	at com.mojang.blaze3d.systems.RenderSystem.pollEvents(RenderSystem.java:201)
	at com.mojang.blaze3d.systems.RenderSystem.flipFrame(RenderSystem.java:219)
	at net.minecraft.class_1041.method_15998(class_1041.java:288)
	at net.minecraft.class_310.method_1523(class_310.java:1241)
	at net.minecraft.class_310.method_1514(class_310.java:802)
	at net.minecraft.client.main.Main.main(Main.java:250)
	at net.fabricmc.loader.impl.game.minecraft.MinecraftGameProvider.launch(MinecraftGameProvider.java:470)
	at net.fabricmc.loader.impl.launch.knot.Knot.launch(Knot.java:74)
	at net.fabricmc.loader.impl.launch.knot.KnotClient.main(KnotClient.java:23)

A detailed walkthrough of the error, its code path and all known details is as follows:
---------------------------------------------------------------------------------------

-- Head --
Thread: Render thread
Stacktrace:
	at net.minecraft.class_442.handler$dom000$gbfabrictools$addConfigScreen(class_442.java:1530)
	at net.minecraft.class_442.method_25426(class_442.java:156)
	at net.minecraft.class_437.method_25423(class_437.java:297)
	at net.minecraft.class_310.method_1507(class_310.java:1080)
	at net.minecraft.class_8032.method_49296(class_8032.java:98)
	at net.minecraft.class_8032.method_25419(class_8032.java:90)
	at net.minecraft.class_8032.method_48639(class_8032.java:74)
	at net.minecraft.class_4185.method_25306(class_4185.java:94)
	at net.minecraft.class_4264.method_25348(class_4264.java:56)
	at net.minecraft.class_339.method_25402(class_339.java:189)
	at net.minecraft.class_4069.method_25402(class_4069.java:38)
	at net.minecraft.class_312.method_1611(class_312.java:98)
	at net.minecraft.class_437.method_25412(class_437.java:409)
	at net.minecraft.class_312.method_1601(class_312.java:98)
	at net.minecraft.class_312.method_22686(class_312.java:169)
	at net.minecraft.class_1255.execute(class_1255.java:102)
	at net.minecraft.class_312.method_22684(class_312.java:169)
	at org.lwjgl.glfw.GLFWMouseButtonCallbackI.callback(GLFWMouseButtonCallbackI.java:43)
	at org.lwjgl.system.JNI.invokeV(Native Method)
	at org.lwjgl.glfw.GLFW.glfwPollEvents(GLFW.java:3403)
	at com.mojang.blaze3d.systems.RenderSystem.pollEvents(RenderSystem.java:201)
	at com.mojang.blaze3d.systems.RenderSystem.flipFrame(RenderSystem.java:219)

-- Affected screen --
Details:
	Screen name: net.minecraft.class_8032
Stacktrace:
	at net.minecraft.class_437.method_25412(class_437.java:409)
	at net.minecraft.class_312.method_1601(class_312.java:98)
	at net.minecraft.class_312.method_22686(class_312.java:169)
	at net.minecraft.class_1255.execute(class_1255.java:102)
	at net.minecraft.class_312.method_22684(class_312.java:169)
	at org.lwjgl.glfw.GLFWMouseButtonCallbackI.callback(GLFWMouseButtonCallbackI.java:43)
	at org.lwjgl.system.JNI.invokeV(Native Method)
	at org.lwjgl.glfw.GLFW.glfwPollEvents(GLFW.java:3403)
	at com.mojang.blaze3d.systems.RenderSystem.pollEvents(RenderSystem.java:201)
	at com.mojang.blaze3d.systems.RenderSystem.flipFrame(RenderSystem.java:219)
	at net.minecraft.class_1041.method_15998(class_1041.java:288)
	at net.minecraft.class_310.method_1523(class_310.java:1241)
	at net.minecraft.class_310.method_1514(class_310.java:802)
	at net.minecraft.client.main.Main.main(Main.java:250)
	at net.fabricmc.loader.impl.game.minecraft.MinecraftGameProvider.launch(MinecraftGameProvider.java:470)
	at net.fabricmc.loader.impl.launch.knot.Knot.launch(Knot.java:74)
	at net.fabricmc.loader.impl.launch.knot.KnotClient.main(KnotClient.java:23)

-- Last reload --
Details:
	Reload number: 1
	Reload reason: initial
	Finished: Yes
	Packs: vanilla, fabric, Moonlight Mods Dynamic Assets, Essential Assets, essential
Stacktrace:
	at net.minecraft.class_6360.method_36565(class_6360.java:49)
	at net.minecraft.class_310.method_1587(class_310.java:2413)
	at net.minecraft.class_310.method_1514(class_310.java:821)
	at net.minecraft.client.main.Main.main(Main.java:250)
	at net.fabricmc.loader.impl.game.minecraft.MinecraftGameProvider.launch(MinecraftGameProvider.java:470)
	at net.fabricmc.loader.impl.launch.knot.Knot.launch(Knot.java:74)
	at net.fabricmc.loader.impl.launch.knot.KnotClient.main(KnotClient.java:23)

-- System Details --
Details:
	Minecraft Version: 1.20.1
	Minecraft Version ID: 1.20.1
	Operating System: Windows 11 (amd64) version 10.0
	Java Version: 17.0.8, Microsoft
	Java VM Version: OpenJDK 64-Bit Server VM (mixed mode), Microsoft
	Memory: 16734346176 bytes (15959 MiB) / 24092082176 bytes (22976 MiB) up to 40869298176 bytes (38976 MiB)
	CPUs: 16
Processor Vendor: AuthenticAMD Processor Name: AMD Ryzen 7 5800X 8-Core Processor Identifier: AuthenticAMD Family 25 Model 33 Stepping 2 Microarchitecture: Zen 3 Frequency (GHz): 4.20 Number of physical packages: 1 Number of physical CPUs: 8 Number of logical CPUs: 16 Graphics card #0 name: Virtual Desktop Monitor Graphics card #0 vendor: Virtual Desktop, Inc. Graphics card #0 VRAM (MB): 0.00 Graphics card #0 deviceId: unknown Graphics card #0 versionInfo: DriverVersion=10.54.50.446 Graphics card #1 name: Parsec Virtual Display Adapter Graphics card #1 vendor: Parsec Cloud, Inc. Graphics card #1 VRAM (MB): 0.00 Graphics card #1 deviceId: unknown Graphics card #1 versionInfo: DriverVersion=0.45.0.0 Graphics card #2 name: NVIDIA GeForce RTX 4090 Graphics card #2 vendor: NVIDIA (0x10de) Graphics card #2 VRAM (MB): 24095.00 Graphics card #2 deviceId: 0x2684 Graphics card #2 versionInfo: DriverVersion=32.0.15.5585 Memory slot #0 capacity (MB): 16384.00 Memory slot #0 clockSpeed (GHz): 3.60 Memory slot #0 type: DDR4 Memory slot #1 capacity (MB): 16384.00 Memory slot #1 clockSpeed (GHz): 3.60 Memory slot #1 type: DDR4 Memory slot #2 capacity (MB): 16384.00 Memory slot #2 clockSpeed (GHz): 3.60 Memory slot #2 type: DDR4 Memory slot #3 capacity (MB): 16384.00 Memory slot #3 clockSpeed (GHz): 3.60 Memory slot #3 type: DDR4 Virtual memory max (MB): 85532.29 Virtual memory used (MB): 51955.71 Swap memory total (MB): 20096.00 Swap memory used (MB): 74.20 JVM Flags: 4 total; -XX:HeapDumpPath=MojangTricksIntelDriversForPerformance_javaw.exe_minecraft.exe.heapdump -Xss1M -Xmx38976m -Xms256m Fabric Mods: ad_astra: Ad Astra 1.15.5 ad_astra_giselle_addon: Ad Astra: Giselle Addon 5.6 additionallanterns: Additional Lanterns 1.1.1a advancednetherite: Advanced Netherite 2.1.0-1.20.1 adventurez: AdventureZ 1.4.20 alloy_forgery: Alloy Forgery 2.1.2+1.20 another_furniture: Another Furniture 1.20.1-3.0.1 appleskin: AppleSkin 2.5.1+mc1.20 archers: Archers (RPG Series) 1.2.1+1.20.1 
com_github_zsoltmolnarrr_tinyconfig: TinyConfig 2.3.2 structure_pool_api: Structure Pool API 1.0+1.20.1 architectury: Architectury 9.2.14 archon: Archon 0.6.2 cardinal-components-base: Cardinal Components API (base) 5.2.2 cardinal-components-entity: Cardinal Components API (entities) 5.2.2 saflib: SafLib 1.1.0 artifacts: Artifacts 9.5.7 expandability: ExpandAbility 9.0.4 step-height-entity-attribute: Step Height Entity Attribute 1.2.0 attributefix: AttributeFix 21.0.4 azurelibarmor: AzureLib Armor 2.0.3 backpacked: Backpacked 3.0.0-beta.2 mm: Manningham Mills 2.3 bcc: BetterCompatibilityChecker 4.0.8 bclib: BCLib 3.0.14 wunderlib: WunderLib 1.1.5 beaconoverhaul: Beacon Overhaul 1.8.4+1.20 reach-entity-attributes: Reach Entity Attributes 2.4.0 betterdeserttemples: YUNG's Better Desert Temples 1.20-Fabric-3.0.3 org_reflections_reflections: reflections 0.10.2 betterdungeons: YUNG's Better Dungeons 1.20-Fabric-4.0.4 betterend: Better End 4.0.11 betterendisland: YUNG's Better End Island 1.20-Fabric-2.0.6 betterfortresses: YUNG's Better Nether Fortresses 1.20-Fabric-2.0.6 bettermineshafts: YUNG's Better Mineshafts 1.20-Fabric-4.0.4 betternether: Better Nether 9.0.10 betteroceanmonuments: YUNG's Better Ocean Monuments 1.20-Fabric-3.0.4 betterstrongholds: YUNG's Better Strongholds 1.20-Fabric-4.0.3 betterthirdperson: Better Third Person 1.9.0 betterwitchhuts: YUNG's Better Witch Huts 1.20-Fabric-3.0.3 biomemusic: Biome Music Mod 1.20.1-2.3 blur: Blur (Fabric) 3.1.0 midnightlib: MidnightLib 1.4.1 satin: Satin 1.13.0 bookshelf: Bookshelf 20.1.10 bosses_of_mass_destruction: Bosses of Mass Destruction (Beta) 1.7.5-1.20.1 maelstrom_library: Maelstrom Library 1.6.1-1.20 multipart_entities: MultipartEntities 1.5-1.20 botanypots: BotanyPots 13.0.33 botanytrees: BotanyTrees 9.0.11 botarium: Botarium 2.3.3 team_reborn_energy: Energy 3.0.0 bountiful: Bountiful 6.0.3+1.20.1 cardinal-components: Cardinal Components API 5.2.2 cardinal-components-block: Cardinal Components API (blocks) 
5.2.2 cardinal-components-chunk: Cardinal Components API (chunks) 5.2.2 cardinal-components-item: Cardinal Components API (items) 5.2.2 cardinal-components-level: Cardinal Components API (world saves) 5.2.2 cardinal-components-scoreboard: Cardinal Components API (scoreboard) 5.2.2 cardinal-components-world: Cardinal Components API (worlds) 5.2.2 charmofundying: Charm of Undying 6.5.0+1.20.1 spectrelib: SpectreLib 0.13.15+1.20.1 chefsdelight: Chefs Delight 1.0.3-fabric-1.20.1 chimes: Chimes 2.0.1 cloth-config: Cloth Config v11 11.1.118 cloth-basic-math: cloth-basic-math 0.6.1 clumps: Clumps 12.0.0.4 collective: Collective 7.61 combatroll: Combat Roll 1.3.2+1.20.1 comforts: Comforts 6.3.5+1.20.1 continuity: Continuity 3.0.0-beta.5+1.20.1 coroutil: CoroUtil 1.20.1-1.3.7 cristellib: Cristel Lib 1.1.5 blue_endless_jankson: jankson 1.2.3 croptopia: Croptopia 3.0.3 ctov: ChoiceTheorem's Overhauled Village 3.4.3 culinaryconstruct: Culinary Construct 5.2.1+1.20.1 cupboard: cupboard 1.20.1-2.6 darkpaintings: DarkPaintings 17.0.4 darkutils: DarkUtilities 17.0.3 decorative_blocks: Decorative Blocks 4.1.3 deeperdarker: Deeper and Darker 1.2.6 customportalapi: Custom Portal Api 0.0.1-beta64-1.20 dimdoors: DimensionalDoors 5.3.5 com_flowpowered_flow-math: flow-math 1.0.3 com_github_dimensionaldevelopment_poly2tri_java: poly2tri.java 0.1.1 org_jgrapht_jgrapht-core: jgrapht-core 1.1.0 distanthorizons: Distant Horizons 2.0.4-a-dev doubledoors: Double Doors 5.7 dragonfight: Dragonfight Mod 1.20.1-4.5 dummmmmmy: MmmMmmMmmMmm 1.20-1.8.17b dungeonnowloading: Dungeon Now Loading 1.5 dungeons_arise: When Dungeons Arise 2.1.58 dungeons_arise_seven_seas: When Dungeons Arise: Seven Seas 1.0.2 durabilitytooltip: Durability Tooltip 1.1.5 dynamictrim: DynamicTrim 1.4.1 mixinsquared: MixinSquared 0.1.1 easymagic: Easy Magic 8.0.1 ecologics: Ecologics 2.2.0 elementa: Elementa 647 elytraslot: Elytra Slot 6.3.0+1.20.1 enchantedlib: Enchanted Lib 0.3.1 enchdesc: EnchantmentDescriptions 17.0.15 
endrem: End Remastered 5.2.4 ends_delight: End's Delight refabricated-1.20.1-alpha-1.0 epherolib: EpheroLib 1.2.0 essential: Essential 1.3.2.5+ge4fdbcd438 essential-container: essential-container 1.0.0 essential-loader: essential-loader 1.2.3 everycomp: Every Compat 1.20-2.6.56 porting_lib_tags: Porting Lib Tags 3.0 expandeddelight: Expanded Delight 0.3.1 omega-config: OmegaConfig 1.4.0+1.20.1 explorify: Explorify v1.4.0 fabric-api: Fabric API 0.92.0+1.20.1 fabric-api-base: Fabric API Base 0.4.31+1802ada577 fabric-api-lookup-api-v1: Fabric API Lookup API (v1) 1.6.36+1802ada577 fabric-biome-api-v1: Fabric Biome API (v1) 13.0.13+1802ada577 fabric-block-api-v1: Fabric Block API (v1) 1.0.11+1802ada577 fabric-block-view-api-v2: Fabric BlockView API (v2) 1.0.1+1802ada577 fabric-blockrenderlayer-v1: Fabric BlockRenderLayer Registration (v1) 1.1.41+1802ada577 fabric-client-tags-api-v1: Fabric Client Tags 1.1.2+1802ada577 fabric-command-api-v1: Fabric Command API (v1) 1.2.34+f71b366f77 fabric-command-api-v2: Fabric Command API (v2) 2.2.13+1802ada577 fabric-commands-v0: Fabric Commands (v0) 0.2.51+df3654b377 fabric-containers-v0: Fabric Containers (v0) 0.1.64+df3654b377 fabric-content-registries-v0: Fabric Content Registries (v0) 4.0.11+1802ada577 fabric-convention-tags-v1: Fabric Convention Tags 1.5.5+1802ada577 fabric-crash-report-info-v1: Fabric Crash Report Info (v1) 0.2.19+1802ada577 fabric-data-attachment-api-v1: Fabric Data Attachment API (v1) 1.0.0+de0fd6d177 fabric-data-generation-api-v1: Fabric Data Generation API (v1) 12.3.4+1802ada577 fabric-dimensions-v1: Fabric Dimensions API (v1) 2.1.54+1802ada577 fabric-entity-events-v1: Fabric Entity Events (v1) 1.6.0+1c78457f77 fabric-events-interaction-v0: Fabric Events Interaction (v0) 0.6.2+1802ada577 fabric-events-lifecycle-v0: Fabric Events Lifecycle (v0) 0.2.63+df3654b377 fabric-game-rule-api-v1: Fabric Game Rule API (v1) 1.0.40+1802ada577 fabric-item-api-v1: Fabric Item API (v1) 2.1.28+1802ada577 
fabric-item-group-api-v1: Fabric Item Group API (v1) 4.0.12+1802ada577 fabric-key-binding-api-v1: Fabric Key Binding API (v1) 1.0.37+1802ada577 fabric-keybindings-v0: Fabric Key Bindings (v0) 0.2.35+df3654b377 fabric-lifecycle-events-v1: Fabric Lifecycle Events (v1) 2.2.22+1802ada577 fabric-loot-api-v2: Fabric Loot API (v2) 1.2.1+1802ada577 fabric-loot-tables-v1: Fabric Loot Tables (v1) 1.1.45+9e7660c677 fabric-message-api-v1: Fabric Message API (v1) 5.1.9+1802ada577 fabric-mining-level-api-v1: Fabric Mining Level API (v1) 2.1.50+1802ada577 fabric-model-loading-api-v1: Fabric Model Loading API (v1) 1.0.3+1802ada577 fabric-models-v0: Fabric Models (v0) 0.4.2+9386d8a777 fabric-networking-api-v1: Fabric Networking API (v1) 1.3.11+1802ada577 fabric-networking-v0: Fabric Networking (v0) 0.3.51+df3654b377 fabric-object-builder-api-v1: Fabric Object Builder API (v1) 11.1.3+1802ada577 fabric-particles-v1: Fabric Particles (v1) 1.1.2+1802ada577 fabric-recipe-api-v1: Fabric Recipe API (v1) 1.0.21+1802ada577 fabric-registry-sync-v0: Fabric Registry Sync (v0) 2.3.3+1802ada577 fabric-renderer-api-v1: Fabric Renderer API (v1) 3.2.1+1802ada577 fabric-renderer-indigo: Fabric Renderer - Indigo 1.5.1+1802ada577 fabric-renderer-registries-v1: Fabric Renderer Registries (v1) 3.2.46+df3654b377 fabric-rendering-data-attachment-v1: Fabric Rendering Data Attachment (v1) 0.3.37+92a0d36777 fabric-rendering-fluids-v1: Fabric Rendering Fluids (v1) 3.0.28+1802ada577 fabric-rendering-v0: Fabric Rendering (v0) 1.1.49+df3654b377 fabric-rendering-v1: Fabric Rendering (v1) 3.0.8+1802ada577 fabric-resource-conditions-api-v1: Fabric Resource Conditions API (v1) 2.3.8+1802ada577 fabric-resource-loader-v0: Fabric Resource Loader (v0) 0.11.10+1802ada577 fabric-screen-api-v1: Fabric Screen API (v1) 2.0.8+1802ada577 fabric-screen-handler-api-v1: Fabric Screen Handler API (v1) 1.3.30+1802ada577 fabric-sound-api-v1: Fabric Sound API (v1) 1.0.13+1802ada577 fabric-transfer-api-v1: Fabric Transfer API (v1) 
3.3.4+1802ada577 fabric-transitive-access-wideners-v1: Fabric Transitive Access Wideners (v1) 4.3.1+1802ada577 fabric-language-kotlin: Fabric Language Kotlin 1.11.0+kotlin.2.0.0 org_jetbrains_kotlin_kotlin-reflect: kotlin-reflect 2.0.0 org_jetbrains_kotlin_kotlin-stdlib: kotlin-stdlib 2.0.0 org_jetbrains_kotlin_kotlin-stdlib-jdk7: kotlin-stdlib-jdk7 2.0.0 org_jetbrains_kotlin_kotlin-stdlib-jdk8: kotlin-stdlib-jdk8 2.0.0 org_jetbrains_kotlinx_atomicfu-jvm: atomicfu-jvm 0.24.0 org_jetbrains_kotlinx_kotlinx-coroutines-core-jvm: kotlinx-coroutines-core-jvm 1.8.1 org_jetbrains_kotlinx_kotlinx-coroutines-jdk8: kotlinx-coroutines-jdk8 1.8.1 org_jetbrains_kotlinx_kotlinx-datetime-jvm: kotlinx-datetime-jvm 0.6.0 org_jetbrains_kotlinx_kotlinx-serialization-cbor-jvm: kotlinx-serialization-cbor-jvm 1.6.3 org_jetbrains_kotlinx_kotlinx-serialization-core-jvm: kotlinx-serialization-core-jvm 1.6.3 org_jetbrains_kotlinx_kotlinx-serialization-json-jvm: kotlinx-serialization-json-jvm 1.6.3 fabricloader: Fabric Loader 0.15.11 mixinextras: MixinExtras 0.3.5 fallingleaves: Falling Leaves 1.15.6 fallingtree: FallingTree 4.3.4 farmersdelight: Farmer's Delight 1.20.1-2.1.1+refabricated porting_lib_accessors: Porting Lib Accessors 2.3.4+1.20.1 porting_lib_base: Porting Lib Base 2.3.4+1.20.1 porting_lib_attributes: Porting Lib Attributes 2.3.4+1.20.1 porting_lib_common: Porting Lib Common 2.3.4+1.20.1 porting_lib_entity: Porting Lib Entity 2.3.4+1.20.1 porting_lib_fluids: Porting Lib Fluids 2.3.4+1.20.1 porting_lib_mixin_extensions: Porting Lib Mixin Extensions 2.3.4+1.20.1 porting_lib_transfer: Porting Lib Transfer 2.3.4+1.20.1 porting_lib_utility: Porting Lib Utility 2.3.4+1.20.1 porting_lib_client_events: Porting Lib Client Events 2.3.4+1.20.1 porting_lib_config: Porting Lib Config 2.3.4+1.20.1 porting_lib_extensions: Porting Lib Extensions 2.3.4+1.20.1 porting_lib_lazy_registration: Porting Lib Lazy Register 2.3.4+1.20.1 porting_lib_loot: Porting Lib Loot 2.3.4+1.20.1 
porting_lib_networking: Porting Lib Networking 2.3.4+1.20.1 porting_lib_recipe_book_categories: Porting Lib Recipe Book Categories 2.3.4+1.20.1 porting_lib_registries: Porting Lib Registries 2.3.4+1.20.1 porting_lib_tool_actions: Porting Lib Tool Actions 2.3.4+1.20.1 porting_lib_core: Porting Lib Core 2.3.4+1.20.1 forgeconfigapiport: Forge Config API Port 8.0.0 framework: Framework 0.7.6 com_electronwill_night-config_core: core 3.6.6 com_electronwill_night-config_toml: toml 3.6.6 org_javassist_javassist: javassist 3.29.2-GA friendsandfoes: Friends&Foes 2.0.10 geckolib: GeckoLib 4 4.4.2 com_eliotlash_mclib_mclib: mclib 20 geophilic: Geophilic v2.2.0-mc1.20u1.20.2 goblintraders: Goblin Traders 1.9.3 graveyard: The Graveyard 3.0 guardvillagers: GuardVillagers 2.0.9-1.20.1 handcrafted: Handcrafted 3.0.6 hybrid-aquatic: Hybrid Aquatic 1.3.2 iceberg: Iceberg 1.1.18 illagerinvasion: Illager Invasion 8.0.5 extensibleenums: Extensible Enums 7.0.1 immediatelyfast: ImmediatelyFast 1.2.16+1.20.4 net_lenni0451_reflect: Reflect 1.3.3 immersive_aircraft: Immersive Aircraft 1.0.1+1.20.1 org_mariuszgromada_math_mathparser_org-mxparser: MathParser.org-mXparser 5.2.1 immersive_armors: Immersive Armors 1.6.1+1.20.1 incendium: Incendium 5.3.5 indium: Indium 1.0.30+mc1.20.4 inventorysorter: Inventory Sorter 1.9.0-1.20 kyrptconfig: Kyrpt Config 1.5.6-1.20 iris: Iris 1.7.0+mc1.20.1 io_github_douira_glsl-transformer: glsl-transformer 2.0.0-pre13 org_anarres_jcpp: jcpp 1.4.14 org_antlr_antlr4-runtime: antlr4-runtime 4.11.1 itemborders: Item Borders 1.2.2 jade: Jade 11.8.0 jamlib: JamLib 0.6.1+1.20.x java: OpenJDK 64-Bit Server VM 17 jei: Just Enough Items 15.3.0.4 justenoughbreeding: Just Enough Breeding 1.2.1 justenoughprofessions: Just Enough Professions (JEP) 3.0.1 kambrik: Kambrik 6.1.1+1.20.1 kotori316_version_checker: ForgeLikeVersionChecker 2.4.0 kotori_scala: Scalable Cat's Force Fabric 2.2.0 org_scala-lang_scala-library: scala-library 2.13.12 org_scala-lang_scala3-library_3: 
scala3-library_3 3.3.1 org_typelevel_cats-core_3: cats-core_3 2.10.0-kotori org_typelevel_cats-free_3: cats-free_3 2.10.0-kotori org_typelevel_cats-kernel_3: cats-kernel_3 2.10.0-kotori lavender: Lavender 0.1.9+1.20 lavender-md: lavender-md 0.1.1+1.20 lavender-md-owo-ui: lavender-md-owo-ui 0.1.1+1.20 leavesbegone: Leaves Be Gone 8.0.0 lootintegrations: Loot integration Mod 1.20.1-3.7 lootr: Lootr 0.7.33.81 magistuarmory: Epic Knights Mod 9.8 magnumtorch: Magnum Torch 8.0.2 majruszlibrary: Majrusz Library 7.0.8 majruszsaccessories: Majrusz's Accessories 1.5.3 majruszsdifficulty: Majrusz's Progressive Difficulty 1.9.10 mcdar: MC Dungeons Artifacts 4.0.3 mcdw: MC Dungeons Weapons 9.0.4 mcwfences: Macaw's Fences and Walls 1.1.1 mcwfurnitures: Macaw's Furniture 3.2.2 mcwlights: Macaw's Lights and Lamps 1.0.6 mcwpaintings: Macaw's Paintings 1.0.5 mcwpaths: Macaw's Paths and Pavings 1.0.5 mcwroofs: Macaw's Roofs 2.3.0 mcwtrpdoors: Macaw's Trapdoors 1.1.3 mcwwindows: Macaw's Windows 2.2.1 mes: Moog's End Structures 1.3.1-1.20-fabric minecraft: Minecraft 1.20.1 mobsunscreen: Mob Sunscreen 3.1.0 modelfix: Model Gap Fix 1.15 moonlight: Moonlight 1.20-2.11.30 more_armor_trims: More Armor Trims 1.2.0 moremobvariants: More Mob Variants 1.3.0.1 moretotems: More Totems 2.16.0 mousetweaks: Mouse Tweaks 2.26 mr_dungeons_andtaverns: Dungeons and Taverns 3.0.3.f mutantmonsters: Mutant Monsters 8.0.7 mvs: Moog's Voyager Structures 4.1.2-1.20-fabric naturalist: Naturalist 4.0.3 netherdepthsupgrade: Nether Depths Upgrade fabric-3.1.6-1.20 nyfsspiders: Nyf's Spiders 2.1.1 oceansdelight: Ocean's Delight fdrf-fabric-1.0.2-1.20 org_jetbrains_annotations: annotations 23.0.0 overloadedarmorbar: Overloaded Armor Bar 1.20.1-2 gbfabrictools: GBfabrictools 1.2.2+1.16 owo: oωo 0.11.2+1.20 paraglider: Paragliders 20.1.3 patchouli: Patchouli 1.20.1-84-FABRIC fiber: fiber 0.23.0-2 phantasm: End's Phantasm 0.3 philipsruins: Philip`s Ruins 1.20.1 pigpen: PigPen 15.0.2 player-animator: Player Animator 
1.0.2-rc1+1.20 polymorph: Polymorph 0.49.5+1.20.1 prism: Prism 1.0.5 projectile_damage: Projectile Damage Attribute 3.2.3+1.20.1 puzzleslib: Puzzles Lib 8.1.20 puzzlesaccessapi: Puzzles Access Api 8.0.7 quarryplus: QuarryPlus 20.1.1159 ranged_weapon_api: RangedWeaponAPI 1.1.1+1.20.1 rare-ice: Rare Ice 0.6.0 resourcefulconfig: Resourcefulconfig 2.1.2 resourcefullib: Resourceful Lib 2.1.25 com_teamresourceful_bytecodecs: bytecodecs 1.0.2 com_teamresourceful_yabn: yabn 1.0.3 rightclickharvest: Right Click Harvest 3.2.3+1.19.x-1.20.1-fabric runelic: Runelic 18.0.2 runes: Runes 0.9.11+1.20.1 sawmill: Universal Sawmill 1.20-1.4.1 sdrp: Simple Discord Rich Presence 4.0.3-build.40+mc1.20.1 com_github_jagrosh_discordipc: DiscordIPC a8d6631cc9 com_kohlschutter_junixsocket_junixsocket-common: junixsocket-common 2.6.2 com_kohlschutter_junixsocket_junixsocket-native-common: junixsocket-native-common 2.6.2 org_json_json: json 20210307 simplylight: Simply Light 1.20.1-1.4.5 simplyswords: Simply Swords 1.55.0-1.20.1 spruceui: SpruceUI 5.0.0+1.20 skinlayers3d: 3d-Skin-Layers 1.6.5 smallships: Small Ships 2.0.0-b1.2 smarterfarmers: Smarter Farmers 1.20-1.8.2 sodium: Sodium 0.5.8+mc1.20.1 somanyenchantments: So Many Enchantments Mod 0.4.1 soulsweapons: Marium's Soulslike Weaponry 1.1.3-1.20-fabric sound_physics_remastered: Sound Physics Remastered 1.20.1-1.4.2 spell_engine: Spell Engine 0.14.3+1.20.1 spell_power: Spell Power Attribute 0.10.2+1.20.1 starterkit: Starter Kit 6.7 stoneworks: Stoneworks 8.0.0 structory: Structory 1.3.5 structory_towers: Structory: Towers 1.0.7 structureessentials: Structure Essentials Mod 1.20.1-3.3 supermartijn642configlib: SuperMartijn642's Config Lib 1.1.8+a supermartijn642corelib: SuperMartijn642's Core Lib 1.1.17 supplementaries: Supplementaries 1.20-2.8.11 suppsquared: Supplementaries Squared 1.20-1.1.14 t_and_t: Towns and Towers 1.12 terralith: Terralith 2.5.1 things: Things 0.3.3+1.20 totw_additions: Towers of the Wild: Additions 1.3 totw_modded: 
Towers Of The Wild: Modded fabric-1.20.1-1.0.5 trashcans: Trash Cans 1.0.18 travelersbackpack: Traveler's Backpack fabric-1.20.1-9.1.13 travelerstitles: Traveler's Titles 1.20-Fabric-4.0.2 treeharvester: Tree Harvester 8.7 trimeffects: Trim Effects 1.1.1-fabric trinkets: Trinkets 3.7.2 twigs: Twigs 3.1.0 universalcraft: UniversalCraft 337 veinmining: Vein Mining 1.4.1+1.20.1 vigilance: Vigilance 297 villagernames: Villager Names 7.3 villagersplus: Villagers Plus 3.1 villagesandpillages: Villages&Pillages 1.0.0 villagespawnpoint: Village Spawn Point 4.2 visuality: Visuality 0.7.1+1.20 visualworkbench: Visual Workbench 8.0.0 watut: What Are They Up To 1.20.1-1.1.1 weaponmaster: YDM's Weapon Master 3.0.5 wirelesschargers: Wireless Chargers 1.0.9+a wizards: Wizards (RPG Series) 1.2.0+1.20.1 yeetusexperimentus: Yeetus Experimentus 2.3.1-build.6+mc1.20.1 yet_another_config_lib_v3: YetAnotherConfigLib 3.4.4+1.20.1-fabric com_twelvemonkeys_common_common-image: common-image 3.10.0 com_twelvemonkeys_common_common-io: common-io 3.10.0 com_twelvemonkeys_common_common-lang: common-lang 3.10.0 com_twelvemonkeys_imageio_imageio-core: imageio-core 3.10.0 com_twelvemonkeys_imageio_imageio-metadata: imageio-metadata 3.10.0 com_twelvemonkeys_imageio_imageio-webp: imageio-webp 3.10.0 org_quiltmc_parsers_gson: gson 0.2.1 org_quiltmc_parsers_json: json 0.2.1 yigd: You're in Grave Danger 2.0.0-beta.13 fabric-permissions-api-v0: fabric-permissions-api 0.2-SNAPSHOT libgui: LibGui 8.1.1+1.20.1 jankson: Jankson 6.0.0+j1.2.3 libninepatch: LibNinePatch 1.2.0 yungsapi: YUNG's API 1.20-Fabric-4.0.5 yungsbridges: YUNG's Bridges 1.20-Fabric-4.0.3 yungsextras: YUNG's Extras 1.20-Fabric-4.0.3 Loaded Shaderpack: (off) Launched Version: fabric-loader-0.15.11-1.20.1 Backend library: LWJGL version 3.3.1 SNAPSHOT Backend API: NVIDIA GeForce RTX 4090/PCIe/SSE2 GL version 3.2.0 NVIDIA 555.85, NVIDIA Corporation Window size: 1024x768 GL Caps: Using framebuffer using OpenGL 3.2 GL debug messages: Using VBOs: 
Yes Is Modded: Definitely; Client brand changed to 'fabric' Type: Client (map_client.txt) Graphics mode: fancy Resource Packs: fabric Current Language: en_us CPU: 16x AMD Ryzen 7 5800X 8-Core Processor 
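A note for anyone else reading a log like this: Fabric's mixin-injected stack frames carry the id of the mod that injected them (the `handler$<tag>$<modid>$` infix in the method name), so the very top frame already points at gbfabrictools - which the mod list shows as a 1.16-era build (1.2.2+1.16) running on 1.20.1, a mismatch that may well be the source of the NoSuchMethodError. A small Python sketch for pulling those mod ids out of a crash report (the sample frame is copied from the trace above; in practice you would read the full crash-report file):

```python
import re

# One frame copied from the crash report above; point this at the full
# crash-report text in practice.
frame = ("at net.minecraft.class_442.handler$dom000$gbfabrictools"
         "$addConfigScreen(class_442.java:1530)")

# Mixin handler frames look like handler$<tag>$<modid>$<method>; the middle
# capture group is the id of the mod that injected the code.
mod_ids = re.findall(r"handler\$[a-z0-9]+\$([a-z_0-9]+)\$", frame)
print(mod_ids)  # → ['gbfabrictools']
```

From there, updating or removing the mismatched mod (or binary-searching the mods folder) is the usual next step.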
I normally play on Forge, but I gave Fabric a try and ended up getting backhanded with this lmao
submitted by Jahcrsde to fabricmc [link] [comments]


2024.06.01 13:42 Jtmitchell4_ (Non-Fasting) test results are terrible. Will someone help me out here?

I am 34 and would classify myself as barely overweight (6’1”, 210 pounds), but sheesh, these don’t look good. Any help greatly appreciated.
submitted by Jtmitchell4_ to Cholesterol [link] [comments]


2024.06.01 13:37 ellaly Abrupt, significant rise in AMH 2.4 -> 7.6

37yo F, normal BMI.
15 months ago, I did an online mail-away AMH blood test. I was on an oral contraceptive. The result was 2.4 - per their reference range and my obgyn's comment, "normal".
Last week, during work-up for egg freezing, I had an ultrasound with an antral follicle count of 34. I got AMH results yesterday at 7.6! I've been off OC for about 11 months. Cycle length 30-32 days. I have no discernible hirsutism, and my weight has been stable.
I'm due to see the fertility NP on Monday, and reached out to schedule appt with my obgyn. But I am so puzzled by this, I seek the wisdom of online community.
Most specifically, what would cause this shocking rise? Do I have PCOS? Does it onset like this?
Thank you for your thoughts. My head is spinning and I appreciate any input.
submitted by ellaly to endocrinology [link] [comments]


2024.06.01 13:36 Ok-Impress-778 Bloodwork advice. Mid range test but very low gonadotropin.

Background information: 18-year-old male, healthy lifestyle. Consistent moderate-to-high intensity exercise. Consistent, healthy diet; 8-10 hours of sleep every night without fail. 6 feet tall, 170 lbs/77 kg, ~15% bf.
Potential symptoms observed over the past year or two prompted me to test my bloods (ever-decreasing libido, no morning wood for the past year plus, increasing difficulty focusing, low energy, etc.).
A finger-prick test taken a couple of months ago gave worrying results, which prompted me to get bloods done properly through the NHS.
The initial finger-prick test showed 'borderline low' free and total testosterone, as well as low gonadotropin levels, in particular very low LH. Test taken fasted, 1 hour after waking. Full results: FSH: 2.7 IU/L, LH: 1.7 IU/L, Prolactin: 146 mIU/L, Testosterone: 15.1 nmol/L, SHBG: 28.6 nmol/L, Free Testosterone: 0.327 nmol/L, Oestradiol: 62.1 pmol/L, Albumin: 45.2 g/L.
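For anyone wanting to sanity-check that free testosterone figure: the reported 0.327 nmol/L is exactly what the Vermeulen (1999) mass-action equation gives from the total T, SHBG, and albumin values above. Here is a minimal Python sketch; the association constants and albumin molar mass are the standard values used by common online free-T calculators, not this specific lab's documented method.

```python
import math

KA_SHBG = 1.0e9   # SHBG-testosterone association constant, L/mol (Vermeulen)
KA_ALB = 3.6e4    # albumin-testosterone association constant, L/mol
ALB_MW = 69000.0  # molar mass of albumin assumed by Vermeulen, g/mol

def free_testosterone(total_t_nmol_l, shbg_nmol_l, albumin_g_l):
    """Estimate free testosterone (nmol/L) via the Vermeulen equation."""
    tt = total_t_nmol_l * 1e-9   # nmol/L -> mol/L
    shbg = shbg_nmol_l * 1e-9
    n = 1.0 + KA_ALB * albumin_g_l / ALB_MW  # albumin binding factor
    a = n * KA_SHBG
    b = n + KA_SHBG * (shbg - tt)
    # positive root of the mass-action quadratic: a*FT^2 + b*FT - TT = 0
    ft = (-b + math.sqrt(b * b + 4.0 * a * tt)) / (2.0 * a)
    return ft * 1e9  # mol/L -> nmol/L

# Finger-prick panel from the post: TT 15.1 nmol/L, SHBG 28.6 nmol/L, albumin 45.2 g/L
print(round(free_testosterone(15.1, 28.6, 45.2), 3))  # → 0.327
```

That the numbers reproduce exactly suggests the lab's free T was calculated from the other markers rather than measured directly, which is routine for this kind of panel.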
I understand these kinds of tests can be unreliable, being one of the reasons that I followed this up with a GP.
Then I have the blood test results from the NHS, received today; photo attached to this post.
Test taken under the same conditions: fasted, roughly 2 hours after waking. The previously stated symptoms did not change between the two tests; if anything, some have been aggravated to a minor degree.
Expectedly, these results varied, in some cases dramatically, from what I'd received before. The main markers that concerned/confused me were that my testosterone level appears significantly higher than before (great news) and within the normal range. However, my gonadotropin levels were even lower this time around, in particular my LH, standing at 1.4 IU/L.
This leads me to suspect some degree of secondary hypogonadism, which could potentially explain my symptoms over the past year or so. However, the disparity between my testosterone and gonadotropin levels is what really confuses me.
If anybody has experienced or even observed similar results, some insight would be helpful as I'm not sure what my next course of action should be. There is limited information (at least that I have been able to find) on this from articles or scientific literature.
If anybody needs any additional info, I'm happy to provide.
Thanks
submitted by Ok-Impress-778 to trt [link] [comments]


2024.06.01 13:34 ButterflyPlenty9838 Pap smear result and HPV test

I have been very scared for the past month. I had GW 3 years ago, brown discharge between periods for 6 months straight, cervicitis, a 0.3 cm cervical polyp, and discomfort during sex but no bleeding. Googling cervical cancer symptoms, they match mine. So I immediately got a Pap smear and HR HPV test, and today I got the result. I will see my doctor next week, but I can't wait. Since I also paid for the HPV typing test, why is there no result for that, even though the lab told me it was included? Can you please tell me by checking my result? Thank you so much.
LIQUID BASED CYTOLOGY
Procedure: Pap smear (LBC/ThinPrep)
Source: Cervix
Clinical information: Single
Specimen adequacy: Adequate; satisfactory for evaluation; endocervical cells and/or metaplastic cells present
INTERPRETATION AND RESULTS: Negative for intraepithelial lesion or malignancy (NILM)
Infection: No specific microorganism present; normal bacterial flora seen
Inflammation: Mild
Epithelial abnormality present: No specific change present
SQUAMOUS CELLS: No specific change present
GLANDULAR CELLS: No specific change present
MICROSCOPIC FEATURES: The smear shows normal superficial and intermediate squamous cells and a few parabasal cells. Endocervical cells are not included. There is a small number of infiltrating neutrophils. Features of dyskaryosis are not present.
CYTOLOGICAL DIAGNOSIS: 1. Negative for intraepithelial lesion or malignancy (NILM). 2. A mild inflammatory cervical smear. 3. Regular follow-up is suggested.
submitted by ButterflyPlenty9838 to HPV [link] [comments]

