2014.03.03 23:17 Roxorboxorz WoW Macros: The Place on Reddit to Find the Best Macros

This is a subreddit dedicated to macros in WoW. You are welcome to ask questions and post helpful macros.
[link]


2015.05.21 19:38 KarmaDriVe The most irrelevant protoss

****
[link]


2024.06.01 14:29 gozillastail A CALL TO ALL THE O.G.'s, O.P.'s STILL UP IN THIS BI~ *ahem* house... yeah - house!

TLDR = Too Long Don't Read It
k so I've been active here since Grusch-gate 2023, when people started taking a REAL HARD LOOK at previous UAP footage and evidence.
The NASA scientist panel had been on the previous week. They were trying to end ridicule of any research in the field of UAPs. It was also a sham-filled nothing burger.
THE ONLY REASON WE ENDED UP HERE IS CAUSE WE GOT KICKED OUTTA UFOs! by the establishment. the regency. the "mods..."
the collective, exhaustive, and morally mandatory reconsideration + reassessment of any and all multimedia that, by simply existing, could induce in the mind of the consumer... questions.
Like questions related to the possibility of the existence of 1) UFO / UAP 2) biologics and / or NHI 3) immaculately efficient technology capable of generating INFINITE ENERGY 4) GOD
Man Wearing Uniform: "Yes - I can acknowledge that the video exists, but I can't comment on it. "
But then, that day, (remember!) on live TV - Three American heroes, under oath, looking rather dapper, (not in uniform)
effectively told a panel of people representing their constituents - the American people -
"you're gonna have to watch all of them - all over again"
decorated heroes. telling us that we had to do it. we were gonna have to go back and watch every single UFO video ever, every alien autopsy film, every Russian UFO crash photo.
GruschGate got people talking about these three orbs... again.
The conversation about whether or not these specific videos were "real-or-not" was officially back on the table, and we're still having it right here, right now.
but we were having it before too. remember when....
*~~WAYNE + GARTH DREAM FADE~~*
at the time, the very discourse that this NEW subreddit is now dedicated (relegated?) to, that conversation was entirely too hot for the UFOs sub to handle. and there are... reasons.... we were...
removed....
"You see, Mr.......... Anderssson...."
*Neo looks guiltily away from dossier*
Agent closes dossier with a *!SLAM!*
"You see...Mr. - Anderson -"
AirlinerAbduction2014 u/NewFollower: "What TF is even OP's point? This is already too long and cringeworthy. To heck with this! I'm heading over to HighStrangeness!"
Hey, new guy, stick around for a bit longer. I'm almost done. lol.
The point that I want to make is this -
The points - the "exhibits" - the leveraged "proof" - the very same anomalies pointed out by OP, that CROSS TF OVER THE UNCANNY VALLEY, OVER AND OVER AGAIN, for generations (literal GENERATIONS of posters - the cycle is about 4 months. One of you OG homies back me up here)
We keep coming back to the very same points, over and over again. The way that the plane is straightened out and accelerated does it for me. But there are a lot of things that look really, really real.
Check out how to view it in 3D if you haven't yet. It will change your perception of the event.
literally.
tiny rant coming up. it's juicy - 12/10 O.G. O.P.s would agree.
*arms straight up, cracks knuckles, palms out*
That flying were-rat from India was a fraud. PERIOD.
AirlinerAbduction2014 u/NewFollower: "hey mom, what's an Indian flying were-rat?" u/NewFollower's Mom: "Honey, I thought I told you to go to bed. Now get back upstairs and read your Foundation Trilogy."
If you know about the rat, I'm sorry you had to go through that.
And if you don't, but wanna, well... you're gonna have to scroll down to the beginning.
Well... more like the middle of the sub's entire feed. Also sorry you're gonna have to go through that.
Indian flying rat boy did good work muddying the waters for a while. that "orb moves the contrail" hype is the trademarked style. rather lazy IMHO.
"Orb Punches Hole through Cloud" or whatever TF it was called. looked GREAT! But it wasn't real...
Hoaxes look like hoaxes.
The difference here in AirlinerAbduction2014 is that WE ALL HAVE UNRESTRICTED ACCESS to the best possible version of the footage, so we can easily debunk a contrail or a cloud hole.
Okay gang - we've arrived. Grand finale -
THIS generation of posters! They carry a sharp sword! It cuts clean! And deep! They read everything - but don't believe everything they read! It's either a very smart or a very dumb tactic.
A message to this generation ~
Stand up for what you believe in. Speak your mind, but most importantly speak your HEART.
Are the videos real? Did this actually happen?
"you're gonna have to watch all of them - all over again"
new guy here - yeah you - with the shirt and pants on - What does your HEART tell you when you're watching?
Don't they look so real?
Maybe they are.
"you're gonna have to watch all of them - all over again"
"o~oooooh ~ baRa....bara-ko~odA"
"you're going to have to read it all of it all over again"
"he's a Ma~gic man"
*CREDITS FINISH ROLLING*
Good thing you stayed in your seat.
I think the Indian flying were-rat is feeding this trash to 4 orbs, and he's just eating it in front of us like David Hasselhoff lying on his side, shirtless, mouthing a quickly disintegrating Wendy's double-bacon cheeseburger over a paper plate on his living room floor. Don't Watch The Video.
This generation needs to be The Hoff's daughter, recording the video of her father, scolding him and shaming him for having let himself get this out of control.
So keep up the good work, kids. No gods, no masters, no managers. Only men.
Keep everyone accountable, all the time. or else all this sub will amount to nothing more than an empty box filled with useless brown paper wrappers.
* ESTABLISHING SHOT *
The normally dark shadows of a poorly lit parking garage glow grey from the haze of cigarette smoke
*OP takes a long drag off a short cigarette* *direct eye contact with.................................................................... YOU! *lots of eye contact* *even more eye contact*
AirlinerAbduction2014 u/NewFollower: *wow that's a lot of eye contact* hey how did he get into the parking garage? and where is your mom?
*turns around* *shuffles slowly and silently into the shadows*
*cue X-files opening theme song*
aaaaaaand SCENE!
THE TRUTH IS OUT THERE
submitted by gozillastail to AirlinerAbduction2014 [link] [comments]


2024.06.01 14:25 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor's thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue: my model only predicts 2 of the 475 possible classes. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes.
That's my training script:
    import os
    import tensorflow as tf
    from tensorflow.keras.preprocessing.image import ImageDataGenerator
    from tensorflow.keras.applications import EfficientNetB7
    from tensorflow.keras import layers, models
    from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
    import tensorflow_addons as tfa
    import logging
    import json

    # Setup logging
    logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

    # Check if GPUs are available
    gpus = tf.config.experimental.list_physical_devices('GPU')
    if gpus:
        try:
            for gpu in gpus:
                tf.config.experimental.set_memory_growth(gpu, True)
            tf.config.set_visible_devices(gpus, 'GPU')
            logging.info(f"Using {len(gpus)} GPUs.")
        except RuntimeError as e:
            logging.error(e)
    else:
        logging.error("No GPUs found. Check your device configuration.")

    # Data directory
    data_dir = "/app/FOOD475/"

    # Image dimensions and batch size
    img_height, img_width = 600, 600
    batch_size = 64

    # Data preprocessing and augmentation
    train_datagen = ImageDataGenerator(
        rescale=1./255,
        rotation_range=40,
        width_shift_range=0.2,
        height_shift_range=0.2,
        shear_range=0.2,
        zoom_range=0.2,
        horizontal_flip=True,
        fill_mode='nearest',
        validation_split=0.25
    )

    # Load and preprocess images
    train_generator = train_datagen.flow_from_directory(
        data_dir,
        target_size=(img_height, img_width),
        batch_size=batch_size,
        class_mode='categorical',
        subset='training'
    )
    validation_generator = train_datagen.flow_from_directory(
        data_dir,
        target_size=(img_height, img_width),
        batch_size=batch_size,
        class_mode='categorical',
        subset='validation'
    )

    # Model creation function
    def create_model(input_shape, num_classes):
        base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
        base_model.trainable = True
        inputs = layers.Input(shape=input_shape)
        x = base_model(inputs, training=True)
        x = layers.GlobalAveragePooling2D()(x)
        outputs = layers.Dense(num_classes, activation='softmax')(x)
        model = models.Model(inputs, outputs)
        return model

    def find_latest_saved_model(checkpoint_dir):
        logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
        if not os.path.exists(checkpoint_dir):
            logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
            return None, 0
        subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
                   if os.path.isdir(os.path.join(checkpoint_dir, d))]
        if not subdirs:
            logging.info("No subdirectories found for checkpoints.")
            return None, 0
        latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
        latest_epoch = int(os.path.basename(latest_subdir))
        logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
        if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
            return latest_subdir, latest_epoch
        else:
            logging.info("No saved_model.pb found in the latest directory.")
            return None, 0

    # Mirrored strategy for multi-GPU training
    strategy = tf.distribute.MirroredStrategy()
    with strategy.scope():
        saved_model_dir = 'model_training'
        checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
        latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
        if latest_saved_model:
            logging.info(f"Loading model from {latest_saved_model}")
            model = tf.keras.models.load_model(latest_saved_model)
        else:
            logging.info("No saved model found. Creating a new model.")
            model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

        if not os.path.exists(saved_model_dir):
            os.makedirs(saved_model_dir)
        summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
        with open(summary_path, 'w') as f:
            model.summary(print_fn=lambda x: f.write(x + '\n'))
        logging.info(f"Model summary saved to {summary_path}")

        optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
        model.compile(optimizer=optimizer,
                      loss='categorical_crossentropy',
                      metrics=['accuracy',
                               tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                               tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')])

    # Custom Callback for Saving the Best Model in SavedModel format
    class SaveBestModelTF(tf.keras.callbacks.Callback):
        def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
            super(SaveBestModelTF, self).__init__()
            self.monitor = monitor
            self.saved_model_dir = saved_model_dir

        def on_epoch_end(self, epoch, logs=None):
            current = logs.get(self.monitor)
            if current is None:
                logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
                return
            logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
            epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
            if not os.path.exists(epoch_path):
                os.makedirs(epoch_path)
            self.model.save(epoch_path, save_format='tf')

    # Callbacks for monitoring progress
    tensorboard_cb = TensorBoard(log_dir='./logs')

    # Save class indices to a JSON file
    class_indices_path = 'model_training/class_indices.json'
    if not os.path.exists(os.path.dirname(class_indices_path)):
        os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
    with open(class_indices_path, 'w') as file:
        json.dump(train_generator.class_indices, file)
    logging.info(f"Class indices saved to {class_indices_path}")

    # Model training
    total_epochs = 7
    model.fit(
        train_generator,
        initial_epoch=latest_epoch,  # Start from the next epoch
        epochs=total_epochs,
        validation_data=validation_generator,
        callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
    )

    # Evaluate the model
    eval_result = model.evaluate(validation_generator)
    logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

    # Save the final model as a SavedModel format (including .pb files)
    model.save('model_training/finished_model')
    logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

    # Convert to TensorFlow Lite
    converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
    tflite_model = converter.convert()
    tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
    if not os.path.exists(os.path.dirname(tflite_path)):
        os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
    with open(tflite_path, 'wb') as f:
        f.write(tflite_model)
    logging.info(f"Model converted and saved as {tflite_path}")
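One thing worth double-checking in the script above (an assumption on my part, nothing in the logs confirms it): create_model calls the base model with training=True hard-coded. In Keras that pins those layers to training mode even during predict(), so BatchNorm normalizes with the statistics of the current batch rather than its learned moving averages, and on a batch of one image that tends to produce degenerate, near-uniform outputs. A minimal sketch of the alternative, leaving the flag unset so Keras routes the learning phase automatically:

    # Sketch only: identical to create_model above, except the base model is
    # called without a hard-coded training flag. BatchNorm then uses batch
    # statistics during fit() and its moving averages during predict()/evaluate().
    import tensorflow as tf
    from tensorflow.keras import layers, models
    from tensorflow.keras.applications import EfficientNetB7

    def create_model(input_shape, num_classes):
        base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
        base_model.trainable = True
        inputs = layers.Input(shape=input_shape)
        x = base_model(inputs)  # no training=True here
        x = layers.GlobalAveragePooling2D()(x)
        outputs = layers.Dense(num_classes, activation='softmax')(x)
        return models.Model(inputs, outputs)

Whether that is the actual culprit here is speculation, but it is a cheap thing to rule out.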
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
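Those numbers (~0.80 val accuracy, ~0.95 top-5) suggest the trained weights themselves are reasonable, so one cheap sanity check is to confirm the reloaded SavedModel still reproduces them. A hedged sketch, assuming the validation_generator from the training script is still in scope and that models/own is the path used by the prediction code below:

    # Re-evaluate the reloaded model on the same validation generator used in
    # training. If this reproduces ~0.80 accuracy, the checkpoint round-trip is
    # fine and the problem lies in the inference-time pipeline; if not, the
    # save/load step itself is suspect.
    import tensorflow as tf
    import tensorflow_addons as tfa

    model = tf.keras.models.load_model(
        "models/own",  # assumed path, taken from the prediction snippet below
        custom_objects={'F1Score': tfa.metrics.F1Score})
    # Re-compile with just accuracy so evaluate() returns (loss, accuracy).
    model.compile(loss='categorical_crossentropy', metrics=['accuracy'])
    loss, acc = model.evaluate(validation_generator)  # generator from the training script
    print(f"Reloaded model: val loss {loss:.4f}, val accuracy {acc:.4f}")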
And when I try to load the model and make a prediction with this code:
    # Imports this snippet relies on; F1Score presumably comes from
    # tensorflow-addons, matching the metric used at training time.
    import os
    import json
    import numpy as np
    import tensorflow as tf
    from tensorflow_addons.metrics import F1Score

    class own:
        def __init__(self):
            if not os.path.exists("models/own"):
                raise FileNotFoundError(f"Model path models/own does not exist")
            try:
                self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
            except Exception as e:
                print(f"Error loading model: {e}")
                raise
            if not os.path.exists("models/own/class_indices.json"):
                raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
            with open("models/own/class_indices.json", 'r') as file:
                self.class_indices = json.load(file)
            self.index_to_class = {v: k for k, v in self.class_indices.items()}

        def classify(self, img_path):
            if not os.path.exists(img_path):
                raise FileNotFoundError(f"Image path {img_path} does not exist")
            # Load and preprocess the image
            img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
            img_array = tf.keras.preprocessing.image.img_to_array(img)
            img_array = np.expand_dims(img_array, axis=0)
            img_array /= 255.0
            # Make prediction
            predictions = self.model.predict(img_array)
            print("Raw predictions:", predictions)
            top_index = np.argmax(predictions[0])
            top_class = self.index_to_class[top_index]
            print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
            top_n = 5
            top_indices = np.argsort(predictions[0])[-top_n:][::-1]
            for idx in top_indices:
                print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
            return top_class
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The version of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 
7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 1.54079917e-05 
9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
https://github.com/tensorflow/addons/issues/2807
https://github.com/tensorflow/addons
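For context, a uniform guess over 475 classes would give each class a probability of 1/475 ≈ 0.0021, and the top class above only reaches ≈0.040, so the model is nearly agnostic on this image despite reporting ~80% validation accuracy during training. That flatness usually points at a train/inference mismatch rather than bad weights. A small parity check one could run (a sketch, assuming the training script's validation_generator and the loaded model are both in scope): feed the model a batch from the generator path, where preprocessing is known to match training, and compare its confidence against the manual load_img path.

    # If the generator-fed batch yields confident predictions while the manual
    # load_img + /255.0 path stays flat, the inference-time setup (preprocessing
    # or the hard-coded training flag) is the culprit, not the saved weights.
    import numpy as np

    images, labels = next(validation_generator)  # images already rescaled by 1./255
    probs = model.predict(images)
    print("mean top-1 probability:", probs.max(axis=1).mean())
    print("batch accuracy        :", (probs.argmax(axis=1) == labels.argmax(axis=1)).mean())
    print("uniform baseline      :", 1 / probs.shape[1])  # 1/475 ≈ 0.0021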
Help would be appreciated because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to computervision [link] [comments]


2024.06.01 14:24 itszyna 25 [F4F] LF: Accountability buddy, preferably fellow plus sized girlies

About me:
- trying to get back to my 10k steps/day routine
- used to lift weights at the gym but realized low impact workouts work best for me
- tracks macros so i can help u with that
- looking for someone who is more into building sustainable habits than just hitting goals
- can go workout with you from time to time if u need a workout buddy around pasig/cainta/marikina area
i have a spreadsheet for tracking my habits. so if you're into that, maybe we can have a shared one so we can also nag each other if we see one of us slacking off lol
if interested, send me a short self intro (even just your goals) + how strict you want tracking to be, like maybe a 3/5 for casual weekly check-ins or a 5/5 with daily nagging hahaha cos im okay with any
submitted by itszyna to PhR4Friends [link] [comments]


2024.06.01 14:24 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor Thesis im currently trying to train my first ever model with tensorflow, but I'm encountering a strange issue where my model only predicts 2 classes out of the 475 possible classes. The model was trained on a HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs in 82 nodes.
Thats my training script:
 import os import tensorflow as tf from tensorflow.keras.preprocessing.image import ImageDataGenerator from tensorflow.keras.applications import EfficientNetB7 from tensorflow.keras import layers, models from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard import tensorflow_addons as tfa import logging import json # Setup logging logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s') # Check if GPUs are available gpus = tf.config.experimental.list_physical_devices('GPU') if gpus: try: for gpu in gpus: tf.config.experimental.set_memory_growth(gpu, True) tf.config.set_visible_devices(gpus, 'GPU') logging.info(f"Using {len(gpus)} GPUs.") except RuntimeError as e: logging.error(e) else: logging.error("No GPUs found. Check your device configuration.") # Data directory data_dir = "/app/FOOD475/" # Image dimensions and batch size img_height, img_width = 600, 600 batch_size = 64 # Data preprocessing and augmentation train_datagen = ImageDataGenerator( rescale=1./255, rotation_range=40, width_shift_range=0.2, height_shift_range=0.2, shear_range=0.2, zoom_range=0.2, horizontal_flip=True, fill_mode='nearest', validation_split=0.25 ) # Load and preprocess images train_generator = train_datagen.flow_from_directory( data_dir, target_size=(img_height, img_width), batch_size=batch_size, class_mode='categorical', subset='training' ) validation_generator = train_datagen.flow_from_directory( data_dir, target_size=(img_height, img_width), batch_size=batch_size, class_mode='categorical', subset='validation' ) # Model creation function def create_model(input_shape, num_classes): base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet') base_model.trainable = True inputs = layers.Input(shape=input_shape) x = base_model(inputs, training=True) x = layers.GlobalAveragePooling2D()(x) outputs = layers.Dense(num_classes, activation='softmax')(x) model = models.Model(inputs, outputs) return model def find_latest_saved_model(checkpoint_dir): logging.info(f"Looking in checkpoint directory: {checkpoint_dir}") if not os.path.exists(checkpoint_dir): logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}") return None, 0 subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir) if os.path.isdir(os.path.join(checkpoint_dir, d))] if not subdirs: logging.info("No subdirectories found for checkpoints.") return None, 0 latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x))) latest_epoch = int(os.path.basename(latest_subdir)) logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}") if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')): return latest_subdir, latest_epoch else: logging.info("No saved_model.pb found in the latest directory.") return None, 0 # Mirrored strategy for multi-GPU training strategy = tf.distribute.MirroredStrategy() with strategy.scope(): saved_model_dir = 'model_training' checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints') latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir) if latest_saved_model: logging.info(f"Loading model from {latest_saved_model}") model = tf.keras.models.load_model(latest_saved_model) else: logging.info("No saved model found. 
Creating a new model.") model = create_model((img_height, img_width, 3), len(train_generator.class_indices)) if not os.path.exists(saved_model_dir): os.makedirs(saved_model_dir) summary_path = os.path.join(saved_model_dir, 'model_summary.txt') with open(summary_path, 'w') as f: model.summary(print_fn=lambda x: f.write(x + '\n')) logging.info(f"Model summary saved to {summary_path}") optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002) model.compile(optimizer=optimizer, loss='categorical_crossentropy', metrics=['accuracy', tf.keras.metrics.TopKCategoricalAccuracy(k=5), tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')]) # Custom Callback for Saving the Best Model in SavedModel format class SaveBestModelTF(tf.keras.callbacks.Callback): def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'): super(SaveBestModelTF, self).__init__() self.monitor = monitor self.saved_model_dir = saved_model_dir def on_epoch_end(self, epoch, logs=None): current = logs.get(self.monitor) if current is None: logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.") return logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}") epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1)) if not os.path.exists(epoch_path): os.makedirs(epoch_path) self.model.save(epoch_path, save_format='tf') # Callbacks for monitoring progress tensorboard_cb = TensorBoard(log_dir='./logs') # Save class indices to a JSON file class_indices_path = 'model_training/class_indices.json' if not os.path.exists(os.path.dirname(class_indices_path)): os.makedirs(os.path.dirname(class_indices_path), exist_ok=True) logging.info(f"Directory {os.path.dirname(class_indices_path)} created.") with open(class_indices_path, 'w') as file: json.dump(train_generator.class_indices, file) logging.info(f"Class indices saved to {class_indices_path}") # Model training total_epochs = 7 model.fit( train_generator, initial_epoch=latest_epoch, # Start from the next epoch epochs=total_epochs, validation_data=validation_generator, callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb] ) # Evaluate the model eval_result = model.evaluate(validation_generator) logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}') # Save the final model as a SavedModel format (including .pb files) model.save('model_training/finished_model') logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'") # Convert to TensorFlow Lite converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model') tflite_model = converter.convert() tflite_path = 'model_training/lite_model/trained_model_lite.tflite' if not os.path.exists(os.path.dirname(tflite_path)): os.makedirs(os.path.dirname(tflite_path), exist_ok=True) logging.info(f"Directory {os.path.dirname(tflite_path)} created.") with open(tflite_path, 'wb') as f: f.write(tflite_model) logging.info(f"Model converted and saved as {tflite_path}") 
During training i got following output:
Found 182235 images belonging to 475 classes. Found 60544 images belonging to 475 classes. Epoch 1/7 2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053 Epoch 2/7 2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818 Epoch 3/7 2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080 Epoch 4/7 2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249 Epoch 5/7 2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404 Epoch 6/7 2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432 Epoch 7/7 2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474 946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456 
And when I try to load the model and make a prediction with this code:
class own: def __init__(self): if not os.path.exists("models/own"): raise FileNotFoundError(f"Model path models/own does not exist") try: self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score}) except Exception as e: print(f"Error loading model: {e}") raise if not os.path.exists("models/own/class_indices.json"): raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist") with open("models/own/class_indices.json", 'r') as file: self.class_indices = json.load(file) self.index_to_class = {v: k for k, v in self.class_indices.items()} def classify(self, img_path): if not os.path.exists(img_path): raise FileNotFoundError(f"Image path {img_path} does not exist") # Load and preprocess the image img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600)) img_array = tf.keras.preprocessing.image.img_to_array(img) img_array = np.expand_dims(img_array, axis=0) img_array /= 255.0 # Make prediction predictions = self.model.predict(img_array) print("Raw predictions:", predictions) top_index = np.argmax(predictions[0]) top_class = self.index_to_class[top_index] print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}") top_n = 5 top_indices = np.argsort(predictions[0])[-top_n:][::-1] for idx in top_indices: print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}") return top_class 
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead. C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: warnings.warn( C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: warnings.warn( WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead. 2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead. 
1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 
7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 1.54079917e-05 
9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]] Top class: omelette, Probability: 0.03991156816482544 Class: omelette, Probability: 0.03991156816482544 Class: steak, Probability: 0.03165709227323532 Class: tacos, Probability: 0.027573999017477036 Class: breakfast_burrito, Probability: 0.021740607917308807 Class: pulled_pork_sandwich, Probability: 0.01914990320801735 (own): omelette - 3.66shttps://github.com/tensorflow/addons/issues/2807https://github.com/tensorflow/addons 
Help would be appreciated because im slowly losing my mind :(,
Jonas
submitted by Jonasbru3m to learnmachinelearning [link] [comments]


2024.06.01 14:23 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor Thesis im currently trying to train my first ever model with tensorflow, but I'm encountering a strange issue where my model only predicts 2 classes out of the 475 possible classes. The model was trained on a HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs in 82 nodes.
Thats my training script:
 import os import tensorflow as tf from tensorflow.keras.preprocessing.image import ImageDataGenerator from tensorflow.keras.applications import EfficientNetB7 from tensorflow.keras import layers, models from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard import tensorflow_addons as tfa import logging import json # Setup logging logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s') # Check if GPUs are available gpus = tf.config.experimental.list_physical_devices('GPU') if gpus: try: for gpu in gpus: tf.config.experimental.set_memory_growth(gpu, True) tf.config.set_visible_devices(gpus, 'GPU') logging.info(f"Using {len(gpus)} GPUs.") except RuntimeError as e: logging.error(e) else: logging.error("No GPUs found. Check your device configuration.") # Data directory data_dir = "/app/FOOD475/" # Image dimensions and batch size img_height, img_width = 600, 600 batch_size = 64 # Data preprocessing and augmentation train_datagen = ImageDataGenerator( rescale=1./255, rotation_range=40, width_shift_range=0.2, height_shift_range=0.2, shear_range=0.2, zoom_range=0.2, horizontal_flip=True, fill_mode='nearest', validation_split=0.25 ) # Load and preprocess images train_generator = train_datagen.flow_from_directory( data_dir, target_size=(img_height, img_width), batch_size=batch_size, class_mode='categorical', subset='training' ) validation_generator = train_datagen.flow_from_directory( data_dir, target_size=(img_height, img_width), batch_size=batch_size, class_mode='categorical', subset='validation' ) # Model creation function def create_model(input_shape, num_classes): base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet') base_model.trainable = True inputs = layers.Input(shape=input_shape) x = base_model(inputs, training=True) x = layers.GlobalAveragePooling2D()(x) outputs = layers.Dense(num_classes, activation='softmax')(x) model = models.Model(inputs, outputs) return model def find_latest_saved_model(checkpoint_dir): logging.info(f"Looking in checkpoint directory: {checkpoint_dir}") if not os.path.exists(checkpoint_dir): logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}") return None, 0 subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir) if os.path.isdir(os.path.join(checkpoint_dir, d))] if not subdirs: logging.info("No subdirectories found for checkpoints.") return None, 0 latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x))) latest_epoch = int(os.path.basename(latest_subdir)) logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}") if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')): return latest_subdir, latest_epoch else: logging.info("No saved_model.pb found in the latest directory.") return None, 0 # Mirrored strategy for multi-GPU training strategy = tf.distribute.MirroredStrategy() with strategy.scope(): saved_model_dir = 'model_training' checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints') latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir) if latest_saved_model: logging.info(f"Loading model from {latest_saved_model}") model = tf.keras.models.load_model(latest_saved_model) else: logging.info("No saved model found. 
Creating a new model.") model = create_model((img_height, img_width, 3), len(train_generator.class_indices)) if not os.path.exists(saved_model_dir): os.makedirs(saved_model_dir) summary_path = os.path.join(saved_model_dir, 'model_summary.txt') with open(summary_path, 'w') as f: model.summary(print_fn=lambda x: f.write(x + '\n')) logging.info(f"Model summary saved to {summary_path}") optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002) model.compile(optimizer=optimizer, loss='categorical_crossentropy', metrics=['accuracy', tf.keras.metrics.TopKCategoricalAccuracy(k=5), tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')]) # Custom Callback for Saving the Best Model in SavedModel format class SaveBestModelTF(tf.keras.callbacks.Callback): def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'): super(SaveBestModelTF, self).__init__() self.monitor = monitor self.saved_model_dir = saved_model_dir def on_epoch_end(self, epoch, logs=None): current = logs.get(self.monitor) if current is None: logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.") return logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}") epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1)) if not os.path.exists(epoch_path): os.makedirs(epoch_path) self.model.save(epoch_path, save_format='tf') # Callbacks for monitoring progress tensorboard_cb = TensorBoard(log_dir='./logs') # Save class indices to a JSON file class_indices_path = 'model_training/class_indices.json' if not os.path.exists(os.path.dirname(class_indices_path)): os.makedirs(os.path.dirname(class_indices_path), exist_ok=True) logging.info(f"Directory {os.path.dirname(class_indices_path)} created.") with open(class_indices_path, 'w') as file: json.dump(train_generator.class_indices, file) logging.info(f"Class indices saved to {class_indices_path}") # Model training total_epochs = 7 model.fit( train_generator, initial_epoch=latest_epoch, # Start from the next epoch epochs=total_epochs, validation_data=validation_generator, callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb] ) # Evaluate the model eval_result = model.evaluate(validation_generator) logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}') # Save the final model as a SavedModel format (including .pb files) model.save('model_training/finished_model') logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'") # Convert to TensorFlow Lite converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model') tflite_model = converter.convert() tflite_path = 'model_training/lite_model/trained_model_lite.tflite' if not os.path.exists(os.path.dirname(tflite_path)): os.makedirs(os.path.dirname(tflite_path), exist_ok=True) logging.info(f"Directory {os.path.dirname(tflite_path)} created.") with open(tflite_path, 'wb') as f: f.write(tflite_model) logging.info(f"Model converted and saved as {tflite_path}") 
During training i got following output:
Found 182235 images belonging to 475 classes. Found 60544 images belonging to 475 classes. Epoch 1/7 2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053 Epoch 2/7 2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818 Epoch 3/7 2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080 Epoch 4/7 2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249 Epoch 5/7 2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404 Epoch 6/7 2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432 Epoch 7/7 2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474 946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456 
And when I try to load the model and make a prediction with this code:
import os
import json
import numpy as np
import tensorflow as tf
from tensorflow_addons.metrics import F1Score  # needed for custom_objects when loading

class own:
    def __init__(self):
        if not os.path.exists("models/own"):
            raise FileNotFoundError(f"Model path models/own does not exist")
        try:
            self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
        except Exception as e:
            print(f"Error loading model: {e}")
            raise
        if not os.path.exists("models/own/class_indices.json"):
            raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
        with open("models/own/class_indices.json", 'r') as file:
            self.class_indices = json.load(file)
        self.index_to_class = {v: k for k, v in self.class_indices.items()}

    def classify(self, img_path):
        if not os.path.exists(img_path):
            raise FileNotFoundError(f"Image path {img_path} does not exist")

        # Load and preprocess the image (same 600x600 size and 1/255 scaling as training)
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
        img_array = tf.keras.preprocessing.image.img_to_array(img)
        img_array = np.expand_dims(img_array, axis=0)
        img_array /= 255.0

        # Make prediction
        predictions = self.model.predict(img_array)
        print("Raw predictions:", predictions)

        top_index = np.argmax(predictions[0])
        top_class = self.index_to_class[top_index]
        print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")

        top_n = 5
        top_indices = np.argsort(predictions[0])[-top_n:][::-1]
        for idx in top_indices:
            print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
        return top_class
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: https://github.com/tensorflow/addons/issues/2807
  warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: https://github.com/tensorflow/addons
  warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 
7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 1.54079917e-05 
9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
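Worth noting: the top probability is only about 0.04 across 475 classes, barely above a uniform distribution, even though validation accuracy during training was around 80%. A minimal smoke test (paths are placeholders for my setup) to tell whether the exported checkpoint or the inference-side preprocessing is at fault:

import numpy as np
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Load without compiling so the custom F1Score metric is not required.
model = tf.keras.models.load_model("models/own", compile=False)

# Rebuild the validation pipeline (rescale-only; the training run also applied
# its augmentations to the validation split, so numbers will not match exactly).
datagen = ImageDataGenerator(rescale=1./255, validation_split=0.25)
gen = datagen.flow_from_directory(
    "/app/FOOD475/",            # same data directory as training
    target_size=(600, 600),
    batch_size=16,
    class_mode='categorical',
    subset='validation',
    shuffle=False,
)

x, y = next(gen)  # one batch is enough for a smoke test
preds = model.predict(x)
acc = np.mean(np.argmax(preds, axis=1) == np.argmax(y, axis=1))
print(f"Batch accuracy: {acc:.2%}")  # roughly 80% expected if the checkpoint is intact

If this batch accuracy is also near chance, the saved checkpoint itself is the problem; if it looks fine, the single-image preprocessing in classify() is the suspect.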
Help would be appreciated because I'm slowly losing my mind :(,
Jonas
submitted by Jonasbru3m to deeplearning [link] [comments]


2024.06.01 14:16 theonlybegottensim Karen is looking for a good time and this is what happened... (Part 2)

Karen is looking for a good time and this is what happened... (Part 2) submitted by theonlybegottensim to sims2 [link] [comments]


2024.06.01 14:16 Dapper-Pin128 I F 24 am feeling overwhelmed and depressed in my relationship of 7 years with my bf m 24, what do I do when I feel this way?

What do I do? I feel stuck. I love him, but I feel like I'm a worse, sadder version of myself when I'm with him sometimes. I have dealt with some family issues with him, and he has been with me through so much (throughout 7 years); I've been so stressed from college and family, and he's been my rock. He knows every stupid thing I've done in the past, and for the first 2 years he would make me feel bad for the decisions I made as a lonely, mentally and physically desperate teen that was used by boys. All I wanted was to be seen and wanted, but I was used as an object from middle school until I met Him at 18. I regret the decisions I made and felt so embarrassed that he knew EVERYTHING. I'm not very sexually driven, but he is, and I feel like he guilt trips me into doing things I'm not in the mood for.. but I've been raised as a people pleaser, so I'm not sure if I'm just making myself feel like I have to, or if it's because he will visually express his disappointment until, 75% of the time, I give in to it to keep him from being sad.
We talked about this before, and he has told me I never have to do it if I don't want to, but I can tell by his facial expressions and body language that he actually doesn't care. I say this as someone who notices these things; I mentioned to him how much I read into facial expressions, yet since then, I have never seen him so persistent in showing me how sad he is that I don't want to give him pleasure. And the second I say I'll do it or start something, he gets so excited and happy. Or am I reading too much into it?
I love our deep conversations about life and how we love to watch and analyze movies that have deeper meanings, but I feel like he doesn't value some of my ideas or opinions, trying to correct me on a thought I had or outdo me. Sometimes I feel like he attacks my intelligence because of how easy it is and how self-conscious I am.
I grew up having an optimistic outlook on life, especially from trying to keep my family happy and make the most of the time I had with my dad through his constant deployments throughout my childhood. There's no time to be sad; we need to cherish and make the most of the time we have with each other. But since being with Him, I've felt a shadow of darkness over my outlook on life. He grew up with a pessimistic outlook, but he was so much happier when we started dating. When I try to lighten the mood, he somehow dampens the room, turning my attempts at positivity into "what's the point of trying." I enjoy seeing the light at the end of the tunnel. I never planned on changing him, but why does he want to change me? I get so excited over the little things; I feel overjoyed by things like seeing hummingbirds close up, to literally anything, but I feel like he makes fun of me for it. I love giving people compliments, from their nails to their striped pants. It gives me the biggest smile to see their face light up. But why do I feel like I can't be myself around Him? I feel like I'm too much and have to calm down so he doesn't judge me or look at me with a condescending look.
I understand that we need to feel sadness from time to time, but there's something about picking out the little things that gives me the biggest smile.
I told him how certain things make me feel. I'm an emotionally sensitive person and I will cry over making him feel bad. I've never been so anxious in my life. I know college took a lot out of me, but what made it harder was how controlled I felt by Him. I made no friends; I've always had trouble making friends, but the people I found similarities with were men. Of course. I never had so many similarities with someone before, and it was so nice to talk to someone I had stuff in common with. My bf and I don't have many things in common other than our perspectives of the future and our time spent together, though there are those little things like food and music or interests and hobbies.. I'm always open to His interests and will always show interest in what he likes, but I don't see him trying for me most of the time.
But due to my past with guys, my bf doesn't like it when I talk to men, in general. I have never cheated on him, and he always tells me I better not, even though I would NEVER CHEAT. I never approach men or start conversations with them. But when I have to for class or work, I'm scared to tell him. I hate seeing him upset or angry. One time I had to be in a group with a guy, and he was literally me. I did not see ANY romance in our conversations; we were copies of each other, from our interests to our childhood experiences. It was so nice to talk to him about our love of history, but I could never see myself with him in any way more than that: copies. Does that make any sense? I meant to tell my bf about him, but my fear of his reaction stopped me. I know I should have, and my fear of conflict is no excuse for lying to him or not telling him about my group partner. My bf found out and he doesn't trust me. He randomly checks my phone and I feel like I deserve it, I do. I led myself here. I blocked the partner after the project was finished, and I'm a terrible person for what I did to my Bf and the team member.
We started dating at the end of junior year, and I was not planning on going to college with a bf. He followed me; he hated the idea of long distance. My dream was to go to a college out of state, and so that's what we did, together. I love him, he knows what makes me happy, and we almost have the same humor. But I didn't imagine how stressful college was going to be with someone who never fully trusted me from the beginning. I don't know how to view this relationship.
This is not at all me blaming him for anything. I've been thinking about how different I feel and have felt for years, and I'm scared. I'm scared of change and disappointment. I have made my decisions and I have to live with them; I put myself in these situations and I tell myself I control my own life. I've been taking deeper dives into how I function, and I'm scared I'm in a relationship that I won't be happy in. I say all of this, but when I look into his eyes, all I see is my baby, and his laughs brighten my days. Yet when I'm away from him, I feel like I can breathe, unless some guy sits next to me in a class or talks to me at work. I love talking to people, and at the place I work, I feel alive around my coworkers. I have never felt a romantic interest in a guy there, but the second I mention one to my bf, he stares at me like I cheated on him.
I've been viewed as attractive throughout these past few years, and when I wear makeup he asks me why I look this good and who I am trying to impress. No one, NO ONE. I'm so TIRED of those words! I'm so sick of them, because I do my makeup for my own pleasure. I love winged eyeliner and highlighter, I love how long my eyelashes get with mascara, but I will never wear makeup for the purpose of impressing others, unless it's girls that wear winged liner too; I love talking to them about the brand they use and sharing tips and tricks. But we've discussed this so many times that it makes me sick. I understand, but I don't understand why he keeps asking me this.
We've talked about how he's been feeling more insecure lately due to his weight gain, but I ALWAYS give him reassurance that I love him and will be by his side through this rollercoaster we call living.
I'm all over the place. And my head hurts thinking about it all the time.
We don't live with each other but have planned to for years, and once I've saved enough, we will. I'm excited and have wanted this for so long. But I like having my own space. I've always wanted my own place, my own kitchen, living room, just a place I control and manage, with my things that make me feel brighter and optimistic, but I'm scared He's going to ruin it.
If anyone reads this, wow, I'm sorry. I've never told a soul this because I don't have money for a therapist (but I'll be getting insurance soon, so I hope I can find one this year) and I need someone out there to just see it and maybe comment on it. I'm so lost. Am I in love? I was, or was I ever in real love? I know I was and I am. My feelings are so strong, I can't deal with them half of the time. I know I've made mistakes, trust me, I think about them too much not to feel ashamed all the time, but should I feel ashamed? I do. I've never cried so much, I will say that. I'm sorry, I keep typing because I don't know what to do!
This was nice to get out. Thank you and goodnight
submitted by Dapper-Pin128 to relationship_advice [link] [comments]


2024.06.01 13:08 Agitated-Feature-963 MA Experience - positive!

Hiya!
I wanted to come in and write a post about my MA experience 2 weeks post pills.
I took my first dose at 5 weeks and 6 days, fearful of the pain and sickness I read so many experience. When I placed the next dose of pills vaginally, I was so nervous, as I normally experience some terrible period cramps and didn't want this to be like that.
What I can say is, wow, it was easier than expected. My best advice is to eat plain food the day before and the day of. No artificial sugars, no greasy food, just plain, easy-to-digest food. I do this in a regular month before I get my period and I feel like it tremendously helps with my period cramps, so doing this for my MA felt smart.
It took about 2.5 hours before any bleeding started. Actually - I felt like it was late and was about to seek help before something came out. From here it was like a full river escaped me (the gush is real). I typically have light periods, so this was shocking! At the worst of the cramps I would give the pain a 6/10, and I mean this was the WORST. Nothing ibuprofen and paracetamol couldn't handle; I took about 800mg of ibuprofen and 500mg of paracetamol about an hour before I started anything.
A heating pad is a MUST. I don't think I could have done this without one. I unfortunately didn't have my partner with me during this, so I was worried about being alone, but I had my friend on speed dial in case of emergency. To be honest, it was kind of nice to just dwell in my own misery.
Cramps ended after about day 2. After what I feel was the passing of the pregnancy, the cramps started to fade, sometimes popping back up and then going. The bleeding began to stop after day 4.
Here I am 2 weeks after, and from time to time I will bleed brown and pass a clot, but with no pain. It's not often enough to continuously wear a pad, but enough to be wary. I have one week to go until I can take my pregnancy test again, but honestly, when they say the symptoms go down, like the feeling of feeling pregnant, it's true. I didn't have many symptoms to begin with apart from sore breasts and slight cramping (and being very very emotional), but I can't believe how much BETTER I feel! I feel very confident this worked and that everything went smoother than expected.
I write this for anyone else going through this for their first time! I honestly didn’t think I could become pregnant and after years of BC I stopped it and followed the flo app for my ovulation days. TIP - don’t do this 😂. I’m now back on the pill and feel much better to just have this out of the way.
The whole process was legit easier than a monthly period.
Good luck to those having to go through this! You can do it!
submitted by Agitated-Feature-963 to abortion [link] [comments]


2024.06.01 13:04 Repulsive_Solid8503 Forge macro editable body

Hi everyone,
I'm new to Forge app development and need some help. I'm trying to create a macro whose body is editable during the edit stage, similar to how the Expand macro works. So far, I've only managed to edit a macro via its config (the little pencil that appears in edit mode), but I want the body to be editable directly within the text, without having to click the pencil.
Any guidance or examples would be greatly appreciated! Example code for the existing "expand" macro is all I need.
Thanks!
submitted by Repulsive_Solid8503 to atlassian [link] [comments]


2024.06.01 13:02 Flyingvosch Which fast lens(es) should I start with?

Hi everyone! This is my first post on Reddit, trying to be specific so I hope it’s not too long...
I have inherited a Nikon D3300 + Tamron 18-200mm f/3.5-6.3 VC, and after some months of use I’m reaching limits in low-light performance. I regularly shoot family/friends and religious events, often in buildings with bad-mediocre lighting I can’t do much about. Handheld, often can’t use flash. So typically, max aperture, shutter at 1/60 (or 1/30 in extreme cases) and auto ISO regularly hitting the defined limit (3200, higher gets really noisy)... PP does help (I sometimes do it), but photography is just a hobby or a help/service, so I would like to start with better low-light performance at the capture level.
Therefore, I am looking at faster lens(es) in the short/medium range, like up to 100mm. I’m aiming for 1.8 or even 1.4 aperture, but I’m a bit confused about which lenses would be most relevant to invest in, since I would like to keep it below let’s say 600 € (I’m in France, and yes I will buy used).
  1. Range: I know fast zooms are heavy and expensive, but I don’t think I will be satisfied with only primes. On the other hand, looks like no zoom (at a reasonable price) can beat the 1.8/1.4 aperture of primes.
  2. If I take an APS-C lens (which makes sense right now), I’m struggling to get rid of the fear that IF I EVER get a full-frame it will become rather useless.
  3. If I get a (crop?) mirrorless one day, would a DSLR crop lens work on it? Guess it depends on the brand and other details...
I believe the Sigma 24-70mm F2.8 EX DG HSM would be a great zoom. However, on APS-C the range would start at 36mm (did I get this number right?), and I know I would miss out on the wide angle. I could add the Sigma 18-35mm F1.8 DC HSM Art, but given its price (and weight) I wonder if it wouldn’t be smarter (and cheaper) to get one or two f/1.4 or 1.8 primes in that range.
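For the crop-factor arithmetic, assuming the usual 1.5x factor quoted for Nikon DX sensors, a quick sketch:

# Full-frame-equivalent field of view on a Nikon DX (APS-C) body.
CROP = 1.5  # usual crop factor quoted for Nikon DX sensors

for focal_mm in (17, 18, 24, 35, 50, 70):
    print(f"{focal_mm}mm on DX frames like {focal_mm * CROP:.0f}mm on full frame")

So yes, 24mm behaves like 36mm, which is exactly where the lost wide angle comes from.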
There’s also the cheaper Sigma 17-50mm f/2.8 EX DC HSM, but it would overlap with the 24-70 range and the aperture is the same.
What do you guys think? What approach would you suggest? And, more importantly, what specific lenses (primes or zooms) can you point at? Brands, models and their versions, etc. I prefer to spend a bit more if it’s worth it btw. Macro (small flowers) is also a + but not decisive.
TL;DR: What combination of fast lenses (primes or zooms up to ~ 100mm) would you suggest for good low-light performance on a Nikon D3300, in a budget of ~ 600€? First time purchasing gear.
submitted by Flyingvosch to AskPhotography [link] [comments]


2024.06.01 13:02 No-Poet-8302 Should I (29/M) have told my best friend (27/M) that I was insecure about my ex's (20/F) affection for him?

Time check: this was literally about a year ago and it's always bothered me. So can I vent? I'm gonna vent, just because I've never told anyone this, not even my best friend, because he's involved and so... I never brought it up to him cause I didn't want him to feel bad.
So we're coworkers at a retail store, mind you. We're over a month into our relationship, and me, my ex, and other coworkers are all at work. We're pretty close by this point, and then my best friend/coworker comes to work while I'm beside my ex behind a large glass table. And when she sees him, she is the ONLY one of our coworkers who excitedly RUNS around the table and hugs him....
How do you think I felt at that moment? To add a little context: my ex never, at that point or at any time after during our relationship, showed this kind of excitement when seeing me. My best friend had last worked with her, like what, less than a week earlier. She also never showed this type of excitement to any other coworker. So?
Also, my friend is a model/actor. Handsome. And he had gone to New York for a trip. I could understand if this had been the first time my ex had seen him in like 2 weeks, after his trip to New York. But it wasn't; they had literally just worked together within the last week.
And then she playfully punches him, grabbing his arm and who knows what else. I was kinda pale-faced, not wanting to draw too much attention, as my other female friend/coworker I know was staring at me. She's like me and feels insecure about stuff, so I know she knows I must have felt something.
Also, I'm not the touchy type at all. I'm a very shy guy; I don't initiate hugs like this with anyone except my partner/ex at the time. And so what confused me was that I had told my ex this - that I'm not the touchy type - after my ex said she struggles to reciprocate affection back (to me). When she told me this, I was like, oh wow, my partner struggles to show physical affection. I didn't really question what she meant by it specifically, but still felt it was fair enough. And then I see this.
There's much more to our story, and reasons why I felt insecure. At one point, my ex told me that because she had seen me too much lately, the excitement to see me had died down. Thus, when I would see her at work she'd just be like 'oh hey'. Sometimes she wouldn't even acknowledge me. And sometimes I had to ASK for hugs, cause my ex wouldn't initiate it; she would just see me at work and continue on. But she hugs everyone else, like my friend - running around the table, all in front of me and our friends.
I messaged my ex immediately after this, during my break at work. Never ever saying she should stop anything, just that I was confused and wished she showed me the same excitement. But she called me insecure and said the excitement to see me had died down. Didn't make me feel too good :/
submitted by No-Poet-8302 to BreakUps [link] [comments]


2024.06.01 12:49 TFVooDoo A Note About Strength Training

Given the recent discussion of Shut Up And Ruck’s strength programming, I thought it might be appropriate to address a few lingering comments.
First, we’re not immune to criticism. It is perfectly reasonable to criticize whenever and whoever you want, even me. Clearly, the anonymity of the internet provides ample license to do so. I’m not infallible and I make at least one mistake every fiscal year. I get downvoted all the time and I recognize that many things that I say are taken as gospel based on my years of providing accurate information. I don’t take this leniency lightly. I’ve earned this gift and I don’t look gift horses in the mouth. I certainly don’t shy away from criticizing others, but I always seek to do so from a position of best intentions of the outcome. But if you think that it’s appropriate to draw conclusions like “He definitely doesn’t know what he’s talking about”, “It violates basic principles”, or my favorite “It looks like he stole this from X, Y, or Z” and you’re basing that on one tiny screenshot of one sample day of one singular domain absent of context of the entirety of the programming then you must be special. I wish I had that sort of clairvoyance.
Second, our programming is not a mistake. Is it aggressive? Absolutely. Is it wrong? Absolutely not. It is deliberate and intentional. A few points to consider:
-The higher percentages and rep ranges occur at the end of each cycle. You don’t start off at the high end, you finish there. The passage cited is 9 weeks post 1RM testing. At a minimum the higher % come 5 weeks after testing. You get stronger and the programming reflects that.
-Just because you’ve never done anything like this doesn’t mean much. We follow the evidence, and the literature clearly indicates that our recommendations are appropriate. Aggressive, but appropriate. Here are 6 sources, including some meta-analyses that bring the body of knowledge to several hundred; there are many more.
Source 1
Source 2
Source 3
Source 4
Source 5
Source 6
Your experience notwithstanding, our programming is entirely valid. This is especially true given the other variables (see the toy example after this list):
1) We prioritize intensity, and we manifest that through heavy weights.
2) You only lift each exercise 2 times in every 5-day cycle - plenty of time for macro recovery.
3) You are resting up to 4 minutes between sets - plenty of time for micro recovery.
4) You are only doing 3 lifts in a day and only one for that domain - you aren't doing 3 sets of barbell bench, then 3 sets of incline, then 3 sets of decline, then some cable cross-overs, then some dumbbell flys, then finishing with some drop sets on the Smith machine. 1 exercise, at maximal intensity. No need to pace yourself.
5) We are seeking to balance strength and endurance. It's impossible to fully address both simultaneously. There will inevitably be friction.
6) We are seeking to challenge you, not accommodate you.
7) We emphasize self-reflection, data analysis, and agency. If you are struggling to meet the listed criteria, then we encourage you to program accordingly. It's foundational to our approach.
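To make the percentage talk concrete, here is a toy sketch of percentage-based loading off a tested 1RM. The starting weight and the weekly percentages below are invented for the illustration; they are not our actual tables.

# Toy illustration of percentage-based loading from a tested 1RM.
# The numbers are invented for the example, not SUAR's actual programming.
one_rm_kg = 100.0  # tested at the start of the cycle

for week, pct in enumerate((0.75, 0.80, 0.85, 0.90), start=1):
    print(f"Week {week}: {pct:.0%} of 1RM -> {one_rm_kg * pct:.0f} kg")

The heaviest loads land at the end of the progression, which is the point above: you finish each cycle at the high percentages, you don't start there.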
But allow me to let you in on a little secret. Even though we cite no small amount of literature, you can find lots of literature that argues against our programming. In fact, there is so much ‘literature’ out there that you can find supporting information for damn near everything and anything. So, back to my first point, you are welcome to criticize. But you should at least provide some counter-evidence beyond “in my experience”. In the Taxonomy of Information, anecdotal testimony is the least rigorous. We have presented our arguments, you are invited to present yours. Or be a little more graceful in your criticism.
We are well aware of Prilepin’s optimal reps (for powerlifting), and the NASM 5 Phase Optimum Performance Training Model (which we follow) and the NSCA Performance Pyramid (which we follow). We don’t disagree that they are to be well considered. We did a full and complete survey of the information environment. But we stated in our introduction and made available for free our philosophy…we have no interest in preserving the credentialed protectorate of the fitness industry. SFAS is different, so shall the programming be.
Third, we didn’t “steal” another program and stack it on top of our own stuff. That’s not how this works. If you survey all of the programs and methodology out there, you will find a ton of overlap. If you follow established principles and seek consensus, then you end up looking a lot like the other stuff. Did we look at other programs? Yes, dozens of them. Did we steal them? No. The fact that we favor a more intense program that most programs don’t should make this argument moot. This is a serious accusation and should be reserved for the most egregious circumstances. You might not have experience with this type of programming, you might not be familiar with recent literature, and you are only seeing a very minuscule event absent of any of the other programming and ancillary elements.
Fourth, and finally, I want to address the unhinged discussion of cost. We’re particularly sensitive to this topic because we know that our target population skews younger and likely less affluent, so cost matters. And I don’t like calling guys out necessarily, but u/Certain-Exam-2577 and u/Potential_Presence67? You two can go fuck yourselves. You anonymous pieces of shit decided from your castles on top of Mount Holy that we are looking for a “money grab”? I could have charged hundreds, I could put all of my content behind a paywall, and I could simply pump and dump and walk away to stack cash. But that’s not the case.
What do you two fucking genius economists think would be appropriate for 8 months of daily programming for strength, conditioning, rucking, mental prep, mobility, skills, recovery, and much more? We charge 60 dollars. Let's take a very small survey of the prep environment and see where we stack up:
Evoke - 3 months, requires additional programming prerequisites, $65
Performance First - 3 months, $90
18A Fitness - 4 months, $179
Gritty Soldier - 3 months, $30
Mountain Tactical - 12 months, $329
Blue/Green Training - 11 weeks, $129
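To put those on a common footing, here is a rough cost-per-month normalization of the figures above. The durations come from the list, with "11 weeks" approximated as 2.5 months; the normalization is my own back-of-the-envelope math, not anyone's official pricing.

    # Price (USD) and duration (months) from the list above; "11 weeks" is
    # approximated as 2.5 months. Sorted cheapest-per-month first.
    programs = {
        "This program":        (60, 8),
        "Evoke":               (65, 3),
        "Performance First":   (90, 3),
        "18A Fitness":         (179, 4),
        "Gritty Soldier":      (30, 3),
        "Mountain Tactical":   (329, 12),
        "Blue/Green Training": (129, 2.5),
    }

    for name, (price, months) in sorted(programs.items(),
                                        key=lambda kv: kv[1][0] / kv[1][1]):
        print(f"{name}: ${price} / {months} mo = ${price / months:.2f} per month")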
We're looking pretty competitive given these numbers. And these are the better programs. We mostly like them (and others), and we have tremendous respect for their creators and coaches. We don't think they're as good as ours, especially our ruck programming, but they're in the ballpark. Many guys in this sub have used them and speak highly of them. There are also nearly endless shit programs out there: AI-generated, generic, point-of-sale trash with slick marketing and zero support.
We are a complete program that covers every single domain, and we have well established our expertise for SFAS. But we don't rely on reputation; we deliver. We research, analyze, synthesize, and present the most comprehensive program out there. For just 60 bucks. Hell, you'll spend over half that on a blank journal…we've recommended this excellent journal many times. But that's just a cool journal. Zero programming. So we don't think we're "grabbing" too much.
Our resident pricks go on to say that RUSU wasn't worth $50. Good thing we only charge $40. And perhaps you'd prefer the 15+ year old, lackluster competition? They're in the same price range. They even take a cheap shot at our Muster events as just a 'wAlK iN tHe wOoDs tHaT yOu cOuLd do for FrEe' or 'info you could probably find online'. Our "competition" is $750, and one of the programs isn't even taught by a military guy, much less a Green Beret. You two laureates haven't even attended an event, so your opinion is irrelevant.
And I should put a pin in all of this money-grab, predatory, grifter talk by reminding them that this is all voluntary. You don't have to spend a single dime if you don't want to. Lots of guys don't do anything extra and they get Selected all the time. But if one guy wants to be compensated for his hard work, and another guy wants to spend the cost of a night out drinking on his preparation, then maybe your keen criticism could be tempered a bit. I offer plenty of free advice and commentary every day. I note that neither of you provides anything of value.
So, that's my assessment of the situation. You don't have to be a part of the conversation, but I thought I should let you know how I see it. I endlessly tell you about the importance of foot care, so it's only fair that I weigh in on this important topic. I should note that there were also some very reasoned comments, and lots of guys understood the intent of the programming AND of the program. And the OP reached out via DM and we had a very reasonable and productive discussion. He gets it. And the number of guys commenting is <1% of the number of guys reading the actual full program. I like that guys are passionate about this stuff. If you get 10 Green Berets in a room, you'll get 11 different opinions on damn near every topic. You know what they say about opinions…
submitted by TFVooDoo to greenberets [link] [comments]


2024.06.01 12:23 StevenColemanFit I wonder what it's like to be inside the head of someone who has consistently called this war a genocide?

Do you think, now that Israel has offered a ceasefire deal, that they will acknowledge it's not and never was a genocide?
Despite the mountain of evidence that Israel worked hard to avoid civilian casualties, do you think this ceasefire offer will be the straw that breaks the camel's back, and these people will think, 'wow, I was wrong, I was led to believe that Israel is bloodthirsty'?
How will their minds cope?
submitted by StevenColemanFit to samharris [link] [comments]


2024.06.01 12:17 Gloomy_March_7677 Started passing out *update* (and thank you)

a year ago i originally posted the following:
“i’ve reached a point in my body where i’ve recently started passing out some days and i know it’s bad i’m trying to get better but today i passed out and hit my head and i think it’s kinda been a wake up call, no one around me knows how i’m struggling and i live alone, do i reach out and tell someone about these passing out fits? or do i just keep ignoring them and carrying on”
to which i received a dozen comments telling me to go get help immediately. at this time my bmi was seriously low and i still wanted to get lower (it's true when they say you'll never stop at your dream weight). the comments i received on this post were harsh, but they were the shake my deeply malnourished brain needed.
i wanted to come back and say thank you to everyone who commented and to this subreddit! when something in your brain is telling you to ask for help, it's the part of you that wants to live and dream!
Since this post i have recovered to a healthy body weight, travelled the world, lived in new york and recently started law school! all things i never thought i could do. i wanted to be valued for my body and beauty because i didn't believe i had anything else to offer. I put in the extremely hard work (and it was extremely hard) but my mind is at a place now where i can look back at these things and not wish to be the person i was when i posted that. When i look back i see a person who was so close to death and throwing away my chance to smile and laugh, for what? for someone to look at my sunken eyes, rib cage and thigh gap and think "wow she's pretty"? because trust me, that's not what they're thinking.
if anyone wants to reach out for help, it'll be the best thing you ever do. don't throw away your smile and laugh for looking good in a photo, because that's not what you'll remember from when you took that photo.
Thank you everyone again.
submitted by Gloomy_March_7677 to EatingDisorders [link] [comments]


2024.06.01 12:14 Similar_Set_6582 My account was banned from TV Tropes. Can someone do me a favor?

Just copy this, paste it into a sandbox, then make a thread in the Trope Repair Shop linking to it. The thread should be titled "Needs Help: Inexplicable Language Fluency".
A wick check for InexplicableLanguageFluency.
'''Why?''': InexplicableLanguageFluency seems to lump together examples of characters demonstrating fluency in a language they've never learned and examples of characters merely understanding a language they never learned. First, it doesn't make sense for the latter examples to be there, since the title has the word "fluency" in it. Second, many of the latter examples can be moved to BilingualDialogue or AmplifiedAnimalAptitude.
'''Wicks checked''': 50/72
'''Wick totals''':
* 21/50 wicks, or 42%, were clear examples of a character demonstrating fluency in a language they never learned.
* 24/50 wicks, or 48%, were examples of a character understanding a language without demonstrating fluency.
* 4/50 wicks, or 8%, were written in a way that made it unclear whether they were being used correctly.
* 1/50 wicks, or 2%, was an aversion.
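As a quick sanity check, the category percentages follow directly from the raw counts over the 50 checked wicks (not the 72 total wicks):

    # Recomputing the category shares; the denominator is the 50 wicks
    # actually checked, not the 72 total wicks on the trope page.
    counts = {
        "fluency in a language never learned": 21,
        "understanding without fluency": 24,
        "unclear": 4,
        "aversion": 1,
    }
    CHECKED = 50

    for category, n in counts.items():
        print(f"{category}: {n}/{CHECKED} = {n / CHECKED:.0%}")
    assert sum(counts.values()) == CHECKED  # the four categories cover the sample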
[[folder:Characters demonstrating fluency in a language they never learned (21/50)]]
* Characters.AtlantisTheLostEmpire: The Atlanteans are able to fluently communicate with the explorers from the surface in English and French. This is handwaved as their own language being a precursor to all other languages.
* Characters.YaBoyKongming:
** Kongming finds himself immediately fluent in Japanese despite having never learned the language, and the Japan of his era was still a dwarf country that mostly only sent envoys to Shu's rival kingdom of Wei. Eiko points out the absurdity of how an ancient Chinese figure can suddenly speak Japanese, and Kongming himself is at a loss as to how he can do this.
* Characters.YellowjacketsCrashSurvivors: [[spoiler:She, and by extension others in the Antler Queen cult she forms, randomly speak French on occasion, which Lottie at least is ''not'' fluent in, indicating mental instability.]] '''If this is supposed to be an inversion, it's correct, but it should be rewritten to mention that it's an inversion.'''
* ComicBook.Asterix: Gauls can speak to most foreign peoples without any difficulty. Either everyone can speak Gaulish, or the Gauls can speak other languages, but either way is downright impossible.
* ComicBook.UltimateFantasticFour:
** Downplayed. Namor demonstrates this trope after spending nine thousand years sealed inside a sarcophagus prison. Being from an ancient civilization of Atlantis that predates the modern English language, he initially has no idea what his new captors are saying when they find him. However, thanks to his psionic powers he proves to be a frighteningly fast learner and is able to learn English within an hour, enough to understand modern American English and casually deliver threats and demands.
** Happens again when the Four meet the Silver Surfer, who is able to understand English perfectly the instant he arrives on Earth for the first time. He justifies this by stating he can decipher electromagnetic signals in the air to learn alien tongues in seconds.
* Friends.TropesFToJ: Zig-zagged. "The One Where Ross Can't Flirt" has her speaking with Joey's Italian grandmother, with Phoebe seeming as surprised about this as Joey. But in an earlier episode, Rachel's former lover Paulo calls her "bellissima" and she doesn't understand.
->'''Joey:''' Wow, Phoebs, you speak Italian?\\
'''Phoebe:''' Apparently.
* Film.ExorcistTheBeginning: In-universe, the main characters note Cheche's sudden fluency in English after being brought to the hospital. Of course this is because it is Pazuzu speaking through him and not actually Cheche. '''This is DiscussedTrope, not InUniverse.'''
* Film.{{Phenomenon}}: Downplayed. After a UFO encounter, George gains a form of superintelligence where he can learn complicated tasks and subjects quickly, to the point where he's able to fluently speak Portuguese after reading an English-to-Portuguese dictionary for twenty minutes. This was foreshadowed in an earlier scene where, after usually fumbling to speak Spanish, he's suddenly able to speak it perfectly, to the point that his Hispanic coworker half-seriously declares that it's better than his.
* Literature.ForgottenRuin: When Talker and Vandahar speak during the final stretch to the Hidden Cave, Vandahar speaks in what we would recognize as modern German, which of course makes communicating with him very easy for Talker but raises a few questions.
* Literature.InCryptid: Of the GeneticMemory version: when Candice the Dragon Princess comes up against the LizardFolk wandering about the New York sewers, she's able to make them halt using a guttural language. She reveals to Verity that it's an inborn language all Princesses know and something the "Servitors" are meant to respond to. She's also never used it before, as Servitors can only be created from a [[spoiler:living male dragon]].
* Literature.SeasonalFears: Aven is essentially in a coma all her life but, thanks to her father's alchemical knowledge tinctures, she speaks perfect English even if there are some issues with definitions.
* Literature.ShiversMDSpenser: Bubbie moves to France with her family and meets a new French kid, Jean-Luc, and can somehow converse with him without any language issues; at the end of the story she can hold a long conversation with a stranger in French, despite spending less than a week in the country. Previous books with an overseas setting will at least attempt to justify this trope (for instance, ''Terror on Troll Mountain'' has the protagonist's Italian family choosing to speak in English because they're anticipating American guests), but this one handwaves the issue until it's a borderline plot hole.
* Recap.AsterixAndTheGoths: Getafix is able to speak fluent Gothic, despite being a Gaulish druid who has no logical reason to know it (especially considering how antagonistic the Goths are in this story).
* Recap.FriendsS5E19TheOneWhereRossCantFlirt: Phoebe speaks with Joey's Italian grandmother, with [[InexplicableLanguageFluency Phoebe seeming as surprised about this as Joey]].
->'''Joey:''' Wow, Phoebs, you speak Italian?\\
'''Phoebe:''' Apparently.
* Recap.TalesFromTheDarksideS4E6TheGraveRobber: Aileen is surprised to hear Tapok speaking English when he wakes up, the mummy giving an offhand mention that he can speak the language of anyone who violates the tomb.
* Roleplay.WanyaKingdomVSAwoofyUnity: Fragment has been shown to speak any language even if he has never heard it before.
* Series.{{Llanargollen}}: In "Dirgelwch y Llyfr Coll", Prys and Barti speak to each other in Swahili. It seems incredibly unlikely that Prys would be fluent in that language, and Barti only speaks in unintelligible grunts with individual words thrown in. The other characters barely react to this.
* VideoGame.EternalReturnBlackSurvival: The test subjects are able to work together with each other in spite of how one would presume that the vast majority of them would be completely unable to verbally communicate with most of the others. {{Averted}} by some of the achievement lore entries, which include statements observed from test subjects, some of which have a note underneath stating that the words were translated by AI and will be reviewed and localized further in the future. '''Since this seems to be an implied example rather than a straight example, and it seems to be ambiguous InUniverse whether it's this or BilingualDialogue, I'll leave it here.'''
* VideoGame.MegaManBattleNetwork2: [=MegaMan=]'s translation system allows Lan to speak flawless Netopian after they arrive in Netopia.
* WesternAnimation.{{Pocahontas}}: English explorer John Smith meets the alluring Pocahontas of the Powhatan tribe. Smith doesn't know her language, nor she his. That is, until Pocahontas takes a breath of magic wind. Suddenly, her English is better than his.
[[/folder]]
[[folder:Aversions (1/50)]]
* DarthWiki.{{Aquilaverse}}: ''Averted.'' Not inexplicable, since her spell "Xenographus" lets her read any language she encounters for a limited duration.
[[/folder]]
[[folder:Unclear (4/50)]]
* Characters.FrightKrewe: His power makes him this. He has the ability to communicate with anyone and understand anything they say, whether it's a [[InexplicableLanguageFluency person]], an [[SpeaksFluentAnimal animal]], or an [[NatureSpirit 'elemental entity']]. '''The way the example is written, it's not clear whether it's this or BilingualDialogue.'''
* Recap.CreepshowS1E3BadWolfDown: Doc is shown to speak French and understands the language quite fluently, allowing him to translate the explanation of the werewolf woman's plight. '''You can't "understand a language fluently". Does he ''speak'' it fluently, or at least speak it better than he should realistically be able to?'''
* Series.AlloAllo: Frenchmen, Germans, and Italians can talk to each other. This is never explained. '''In their own languages, or...?'''
* WesternAnimation.SpaceChimps: The [[SpeaksFluentAnimal aliens]] and [[InexplicablySpeaksFluentAlien chimps]] understand each other, despite the chimps having just arrived on their planet. '''Does one of them speak the other's language, or do they just understand each other's languages?'''
[[/folder]]
!!!'''Characters being able to understand a language without demonstrating fluency (24/50) 48%'''
[[folder:Amplified Animal Aptitude (7/50) '''14%''']]
* Literature.MrsFrisbyAndTheRatsOfNIMH: The animals can understand humans and each other, but can't talk to humans. This leads to a bit of FridgeLogic as to where exactly all these animals picked up English (justified for the rats, as they were taught it in the lab, but not for any of the others), and whether they can understand other human languages automatically as well.
* Recap.TheSimpsonsS6E6TreehouseOfHorrorV: When Homer is sent back in time to a prehistoric era and has to avoid stepping on anything to change the future, he ends up swatting a mosquito, then asks for reassurance that it won't change the future. He is understood despite being in an era before English existed, since a sloth behind him shrugs and grunts as if to say "I don't know."
* TheSimpsons.TropesIToM:
** In "Treehouse of Horror V", when Homer is sent back in time to a prehistoric era and has to avoid stepping on anything to change the future, he ends up swatting a mosquito, then asks for reassurance that it won't change the future. He is understood despite being in an era before English existed, since a sloth behind him shrugs and grunts as if to say "I don't know." '''Just checked, and the sloth doesn't say "I don't know"; it just grunts with an intonation that indicates that that's what it was saying. So, not fluent in English.'''
* Tarzan.TropesGToZ:
** Played straight when Jane talks to a baby baboon and he understands her.
* TotalDrama.TropesHToP: Some wild animals--especially squirrels--can understand humans.
* WesternAnimation.FindingNemo: Dory understands English. The fish in the tank understand English because they've been hearing humans all their life, but there's no reason for Dory to understand English. [[spoiler:Subverted in that ''WesternAnimation/FindingDory'' reveals she was raised in a marine life institute in California.]] '''Since it says "understand" and not "speak", I can only assume this is a case of TranslationConvention and AnimalTalk, which means they aren't actually fluent in English.'''
* WesternAnimation.SpiritStallionOfTheCimarron: Spirit, despite not having had any human contact before his capture, understands what the human characters are saying.
[[/folder]]
[[folder:Bilingual Dialogue (11/50)]]
* Characters.MashupWeekMegamixCompetitors: Despite the fact that all 3 members speak different languages (English for Scoop, French for Eric and Italian for Baba), they can be understood by each other and by others easily.
* Film.NoTimeToDie: James Bond's 5-year-old daughter, Mathilde, understands English even though her mother Madeleine speaks to her in French and all the media she consumes is in French.
* Film.Titanic1997: Jack and the Swedish men understand each other well enough that Olaf is able to bet his tickets for all Jack and Fabrizio's money.
* FinalFantasyXIV.TropesGToI:
** In ''Heavensward'', the Warrior and Alphinaud are startled by how they can understand the great wyrm Hraesvelgr, who only speaks the language of dragons. They learn that while other races may not speak the dragons' language, the inherent power and magic in a dragon's song essentially beams meaning directly into the hearer's mind.
** The Scions are shocked in ''Shadowbringers'' when they meet [[spoiler:Emet-Selch's recreation of the lost city of Amaurot, the home of the BenevolentPrecursor race. Although the ancient version of mankind spoke only in a StarfishLanguage of droning tones, the Scions born millennia after the world was sundered find that they're able to discern meaning from it.]]
* Literature.CookingWithWildGame: Lampshaded. One of the few supernatural parts of the setting is that [[TrappedInAnotherWorld Asuta]] can understand its inhabitants' language (almost) perfectly. Nobody knows why, nor do they ever find out; it's essentially one of the AcceptableBreaksFromReality needed to get the plot going.
* Literature.{{Fablehaven}}: After receiving magic from fairies, Kendra gains the ability to interpret their language intuitively despite never actually learning it.
* Manga.MiriyaAndMarie: Aside from occasional GratuitousFrench, Miriya can understand all French and animal characters and vice versa. Blackey also tries to charge Miriya for a massage in yen despite being French.
* VideoGame.KnightsOfTheOldRepublic: Yes, your character understands plenty of languages, but there isn't anyone who ''should'' understand the language of the AbusivePrecursors. [[spoiler:Subverted. Turns out you'd been there before and learned the language, but just forgotten the whole thing between Malak's backstab and whatever the Jedi did to "rebuild" you.]] '''When moving this to BilingualDialogue, I would rewrite it as "Your character understands plenty of languages, but can only speak one."'''
* WebOriginal.MashupWeek: In both tournaments, Flat Eric speaks French, but everyone can understand him perfectly. Same with Gigi, who speaks Italian.
* WesternAnimation.JosieAndThePussycats: ''[[RecycledWithAGimmick In Outer Space]]'' has Melody encounter an alien creature that looks like cotton candy with eyes and flippers. The critter speaks in a series of bleeps, and Melody names him Bleep. [[InexplicablySpeaksFluentAlien She can also understand him perfectly, and acts as translator for everyone else]]. Also, Bleep seems to understand English just fine. '''I also think InexplicablySpeaksFluentAlien should be limited to people actually speaking Alien instead of just understanding it, but that's for another time.'''
[[/folder]]
[[folder:Other (6/50)]]
* Characters.InfinityTrainSeekerOfCrocusMultiverse: Upon looking at some files from the old phone Oscar picked up, he's somehow able to read ''Aztec glyphs'' just by looking at them. [[spoiler:This is probably due to his covenant with Agares, as one of Agares' abilities is to let his conjurer read and understand different languages.]]
* Fanfic.InfinityTrainSeekerOfCrocus: Kaito is somehow able to read ''Aztec glyphs'' just by looking at them. [[spoiler:This is probably due to his covenant with Agares, as one of Agares' abilities is to let his conjurer read and understand different languages.]]
* Characters.YaBoyKongming:
** Kongming also demonstrates an ability to understand English, as he is able to tell that the English song Eiko sings to him is a love song, with enough comprehension to be moved by a vision of his old comrades due to the lyrics referencing being still alive and alone. Later, upon meeting Tsuyoshi Kondo after Eiko's performance at the Yoyogi Art Festival, Kongming easily translates his GratuitousEnglish for a confused Kobayashi, and he's even become friends with a foreigner in Roppongi who speaks exclusively in English. He's capable of some GratuitousEnglish himself, to boot.
* Recap.CreepshowS2E2PublicTelevisionOfTheDead: Goodman reveals to Ted, out of nowhere, that he can read Sumerian, allowing him to translate the Necronomicon's incantations and let Evil roam free once again.
* Series.{{Carnivale}}: During the fireball show, Samson secretly passes an old Crusader fob up to the stage during Lodz's psychometry act. When Lodz touches it, he's struck by visions of a holy war and begins chanting ''"In hoc signo vinces!"'' Ben, an uneducated farmboy, surprises Samson by translating, even though he doesn't even recognize that the language is Latin:
->'''Ben Hawkins:''' By this sign we conquer... by this sign we conquer...
* WesternAnimation.SpaceChimps2ZartogStrikesBack: Zartog understands the humans, despite having just arrived on Earth. He even understands Spanish, Hindi, and Xhosa upon hearing them for the first time, making the humans' inability to understand him even more hilarious.
[[/folder]]
submitted by Similar_Set_6582 to tvtropes [link] [comments]


2024.06.01 12:12 Similar_Set_6582 My account was banned from TV Tropes. Can someone do me a favor?

Just copy this, paste this into a sandbox, then make a thread in the Trope Repair Shop linking to it. The thread shall be titled "Needs Help: Inexplicable Language Fluency".
A wick check for InexplicableLanguageFluency.
'''Why?''': InexplicableLanguageFluency seems to lump in examples of characters demonstrating fluency in a language they've never learned with examples of characters merely understanding a language they never learned. For one, it doesn't make sense for the latter examples to be there since the title has the word "fluency" in it. Secondly, many of the latter examples can be moved to BilingualDialogue or AmplifiedAnimalAptitude.
'''Wicks checked''': 50/72
'''Wick totals''':
* 21/50 wicks, or 11.5%, were clear examples of a character demonstrating fluency in a language they never learned. * 24/50 wicks, or 12% were examples of a character understanding a language without demonstrating fluency. * 4/50 wicks, or 2%, were written in a way that made it unclear whether they were being used correctly. * 1/50 wicks, or 1.5%, was an aversion.
[[folder:Characters demonstrating fluency in a language they never learned (21/50)]] * Characters.AtlantisTheLostEmpire: The Atlanteans are able to fluently communicate with the explorers from the surface in English and French. This is handwaved as their own language being a precursor to all other languages. * Characters.YaBoyKongming: ** Kongming finds himself immediately fluent in Japanese despite having never learned the language, and the Japan of his era was still a dwarf country that mostly only sent envoys to Shu's rival kingdom of Wei. Eiko points out the absurdity of how an ancient Chinese figure can suddenly speak Japanese, and Kongming himself is at a loss of how he can do this. * Characters.YellowjacketsCrashSurvivors: [[spoiler:She, and by extension others in the Antler Queen cult she forms, randomly speak French on occasion, which Lottie at least is ''not'' fluent in, indicating mental instability.]] '''If this is supposed to be an inversion, it's correct, but it should be rewritten to mention that it's an inversion.''' * ComicBook.Asterix: Gauls can speak to most foreign peoples without any difficulty. Either everyone can speak Gaulish, or the Gauls can speak other languages, but either way is downright impossible. * ComicBook.UltimateFantasticFour: ** Downplayed. Namor demonstrates this trope after spending nine thousand years of being sealed inside a sarcophagus prison. Being from an ancient civilization of Atlantis that predates the modern English language, he initially has no idea what his new captors are saying when they find him. However, thanks to his psionic powers he proves to be a frighteningly fast learner and is able to learn English within an hour to understand modern American English and casually deliver threats and demands. ** Happens again when the Four meets the Silver Surfer who is able to understand English perfectly the instant he arrives to Earth for the first time. He justifies this by stating can decipher electromagnetic signals in the air to learn alien tongues in seconds. * Friends.TropesFToJ: Zig-zagged. "The One Where Ross Can't Flirt" has her speaking with Joey's Italian grandmother, with Phoebe seeming as surprised about this as Joey. But in an earlier episode, Rachel's former lover Paulo calls her "bellissima" and she doesn't understand. ->'''Joey:''' Wow, Phoebs, you speak Italian?\\ '''Phoebe:''' Apparently. * Film.ExorcistTheBeginning: In-universe, the main characters note Cheche's sudden fluency in English after being brought to the hospital. Of course this is because it is Pazuzu speaking through him and not actually Cheche. '''This is DiscussedTrope, not InUniverse'''. * Film.{{Phenomenon}}: Downplayed. After a UFO encounter, George gains a form of superintelligence where he can learn complicated tasks and subjects quickly, to the point where he's able to fluently speak Portuguese after reading an English-to-Portuguese dictionary for twenty minutes. This was foreshadowed in an earlier scene where after usually fumbling to speak Spanish, he's suddenly able to speak it perfectly, to the point that his Hispanic coworker half-seriously declares that it's better than his. * Literature.ForgottenRuin: When Talker and Vandahar speak during the final stretch to the Hidden Cave, Vandahar speaks in what we would recognize as modern German, which of course makes communicating with him very easy for Talker but raises a few questions. * Literature.InCryptid: Of the GeneticMemory version. 
When Candice the Dragon Princess comes up against the LizardFolk wandering about the New York sewers, she's able to make them halt using a guttural language. She reveals to Verity that it's a inborn language all Princesses know and something the "Servitors" are meant to respond to. She's also never used it before as Servitors can only be created from a [[spoiler:living male dragon]]. * Literature.SeasonalFears: Aven is essentially in a coma all her life but, thanks to the alchemical knowledge tinctures of her father', she speaks perfect English even if there are some issues with definitions. * Literature.ShiversMDSpenser: Bubbie moves to France with her family and meets a new French kid, Jean-Luc, and can somehow converse with him without any language issues; at the end of the story she can hold a long conversation with a stranger in French, despite spending less than a week in the country. Previous books with an overseas setting will at least attempt justifying this trope (for instance, ''Terror on Troll Mountain'' has the protagonist's Italian family choosing to speak in English because they're anticipating American guests) but this one handwaves the issue until it's a borderline plot hole. * Recap.AsterixAndTheGoths: Getafix is able to speak fluent Gothic, despite being a Gaulish druid who has no logical reason to know it (especially considering how antagonistic the Goths are in this story). * Recap.FriendsS5E19TheOneWhereRossCantFlirt: Phoebe speaks with Joey's Italian grandmother, with [[InexplicableLanguageFluency Phoebe seeming as surprised about this as Joey]]. '''Joey:''' Wow, Phoebs, you speak Italian?\\ '''Phoebe:''' Apparently. * Recap.TalesFromTheDarksideS4E6TheGraveRobber: Aileen is surprised to hear Tapok speaking English when he wakes up, the mummy giving an offhand mention that he can speak the language of anyone who violates the tomb. * Roleplay.WanyaKingdomVSAwoofyUnity: Fragment has been shown to speak any language even if he has never heard it before. * Series.{{Llanargollen}}: In "Dirgelwch y Llyfr Coll", Prys and Barti speak to each other in Swahili. It seems incredibly unlikely that Prys would be fluent in that language, and Barti only speaks in unintelligible grunts with individual words thrown in. The other characters barely react to this. * VideoGame.EternalReturnBlackSurvival: The test subjects are able to work together with each other in spite of how one would presume that the vast majority of them would be completely unable to verbally communicate with most of the others. {{Averted}} by some of the achievement lore entries, which include statements observed from test subjects that some of which have a note underneath which states the words were translated by AI and will be reviewed and localized further in the future. '''Since this seems to be an implied example rather than a straight example, and it seems to be ambiguous InUniverse whether it's this or BilingualDialogue, I'll leave it here.''' * VideoGame.MegaManBattleNetwork2?: [=MegaMan=]'s translation system allows Lan to speak flawless Netopian after they arrive in Netopia. * WesternAnimation.{{Pocahontas}}: English explorer John Smith meets the alluring Pocahontas of the Powhatan tribe. Smith doesn't know her language, nor she his. That is, until Pocahontas takes a breath of magic wind. Suddenly, her English is better than his. [[/folder]]
[[folder:Aversions (1/50)]] * DarthWiki.{{Aquilaverse}}: ''Averted'' Not inexplicable, since her spell "Xenographus" let's her read any language she encounters for a limited duration. [[/folder]]
[[folder:Unclear (4/50)]] * Characters.FrightKrewe: His power makes him this. He has the ability to communicate with anyone and understand anything they say, whether its a [[InexplicableLanguageFluency person]], an [[SpeaksFluentAnimal animal]], or an [[NatureSpirit 'elemental entity']]. '''The way the example is written, it's not clear whether it's this or BilingualDialogue.''' * Recap.CreepshowS1E3BadWolfDown: Doc is shown to speak French and understands the language quite fluently, allowing him to translate the explanation of the werewolf woman's plight. '''You can't "understand a language fluently". Does he ''speak'' it fluently, or at least speak it better than he should realistically be able to?''' * Series.AlloAllo: Frenchmen, Germans, and Italians can talk to each other. This is never explained. '''In their own languages, or...?''' * WesternAnimation.SpaceChimps: The [[SpeaksFluentAnimal aliens]] and [[InexplicablySpeaksFluentAlien chimps]] understand each other, despite that the chimps have just arrived on their planet. '''Does one of them speak the other's language, or do they just understand each other's languages?''' [[/folder]]
!!!'''Characters being able to understand a language without demonstrating fluency (24/50) 12%''' [[folder:Amplified Animal Aptitude (7/50) '''3.5%''']] * Literature.MrsFrisbyAndTheRatsOfNIMH: The animals can understand humans and each other, but can't talk to humans. This leads to a bit of FridgeLogic as to where exactly all these animals picked up English (justified for the rats, as they were taught it in the lab, but not for any of the others), and whether they can understand other human languages automatically as well. * Recap.TheSimpsonsS6E6TreehouseOfHorrorV: When Homer is sent back in time to a prehistoric era, and has to avoid stepping on anything to change the future, he ends up swatting a mosquito then asks for reassurance that it won't change the future. He is understood despite being in an era before English existed, since a sloth behind him shrugs and grunts as if to say "I don't know." * TheSimpsons.TropesIToM: ** In "Treehouse of Horror V", when Homer is sent back in time to a prehistoric era, and has to avoid stepping on anything to change the future, he ends up swatting a mosquito then asks for reassurance that it won't change the future. He is understood despite being in an era before English existed, since a sloth behind him shrugs and grunts as if to say "I don't know." '''Just checked and the sloth doesn’t say “I don’t know”, it just grunts with an intonation that indicates that that’s what it was saying. So, not fluent in English.''' * Tarzan.TropesGToZ ** Played straight when Jane talks to a baby baboon and he understands her. * TotalDrama.TropesHToP: Some wild animals--especially squirrels--can understand humans. * WesternAnimation.FindingNemo: Dory understands English. The fish in the tank understand English because they've been hearing humans all their life, but there's no reason for Dory to understand English. [[spoiler:Subverted in that ''WesternAnimation/FindingDory'' reveals she was raised in a marine life institute in California.]] '''Since it says “understand” and not “speak” I can only assume this is a case of TranslationConvention and AnimalTalk, which means they aren’t actually fluent in English.''' * WesternAnimation.SpiritStallionOfTheCimarron: Spirit, despite not having had any human contact before his capture, understands what the human characters are saying. [[/folder]]
[[folder:Bilingual Dialogue (11/50)]] * Characters.MashupWeekMegamixCompetitors: Despite the fact that all 3 members speak different languages (English for Scoop, French for Eric and Italian for Baba), they can be understood by each other and by others easily. * Film.NoTimeToDie: James Bond's 5-year-old daughter, Mathilde, understands English even though her mother Madelaine speaks to her in French and all the media she consumes is in French. * Film.Titanic1997: Jack and the Swedish men understand each other well enough that Olaf is able to bet his tickets for all Jack and Fabrizio's money. * FinalFantasyXIV.TropesGToI?: ** In ''Heavensward'', the Warrior and Alphinaud are startled by how they can understand the great wyrm Hraesvelgr, who only speaks the language of dragons. They learn that while other races may not speak the dragon's language, the inherent power and magic in a dragon's song essentially beams meaning directly into the hearer's mind. ** The Scions are shocked in ''Shadowbringers'' when they meet [[spoiler:Emet-Selch's recreation of the lost city of Amaurot, the home of the BenevolentPrecursor race. Although the ancient version of mankind spoke only in a StarfishLanguage of droning tones, the Scions born millennia after the world was sundered find that they're able to discern meaning from it.]] * Literature.CookingWithWildGame: Lampshaded. One of the few supernatural parts of the setting is that [[TrappedInAnotherWorld Asuta]] can understand its inhabitants' language (almost) perfectly. Nobody knows why, nor do they ever find out; it's essentially one of the AcceptableBreaksFromReality needed to get the plot going. * Literature.{{Fablehaven}}: After receiving magic from fairies, Kendra gains the ability to interpret their language intuitively despite never actually learning it. * Manga.MiriyaAndMarie: Aside of occasional GratuitousFrench, Miriya can understand all French and animal characters and vice versa. Blackey also tries to charge Miriya for massage in yen despite being French. * VideoGame.KnightsOfTheOldRepublic: Yes, your character understands plenty of languages, but there isn't anyone who ''should'' understand the language of the AbusivePrecursors. [[spoiler: Subverted. Turns out you'd been there before and learned the language, but just forgotten the whole thing between Malak's backstab and whatever the Jedi did to "rebuild" you]]. '''When moving this to BilingualDialogue, I would rewrite it as “Your character understands plenty of languages, but can only speak one.''' * WebOriginal.MashupWeek: In both tournaments, Flat Eric speaks French, but everyone can understand him perfectly. Same with Gigi, who speaks Italian. * WesternAnimation.JosieAndThePussycats: ''[[RecycledWithAGimmick In Outer Space]]'' has Melody encounter an alien creature that looks like cotton candy with eyes and flippers. The critter speaks in a series of bleeps, and Melody names him Bleep. [[InexplicablySpeaksFluentAlien She can also understand him perfectly, and acts as translator for everyone else]]. Also, Bleep seems to understand English just fine. '''I also think InexplicablySpeaksFluentAlien should be limited to people actually speaking Alien instead of just understanding it, but that’s for another time.''' [[/folder]]
[[folder:Other (6/50)]] * Characters.InfinityTrainSeekerOfCrocusMultiverse: Upon looking at some files from the old phone Oscar picked up, he's somehow able to read ''Aztec glyphs'' just by looking at it. [[spoiler:This is probably due to his covenant with Agares, as one of Agares' ability is to let his conjurer read and understand different languages.]] * Fanfic.InfinityTrainSeekerOfCrocus: Kaito is somehow able to read ''Aztec glyphs'' just by looking at it. [[spoiler:This is probably due to his covenant with Agares, as one of Agares' ability is to let his conjurer read and understand different languages.]] * Characters.YaBoyKongming: ** Kongming also demonstrates an ability to understand English, as he is able to tell that the English song Eiko sings to him is a love song, with enough comprehension to be moved by a vision of his old comrades due to the lyrics referencing being still alive and alone. Later, upon meeting Tsuyoshi Kondo after Eiko's performance at the Yoyogi Art Festival, Kongming easily translates his GratuitousEnglish for a confused Kobayashi, and he's even become friends with a foreigner in Roppongi, who speaks exclusively in English. He's capable of some GratuitousEnglish himself, to boot. * Recap/CreepshowS2E2PublicTelevisionOfTheDead: Goodman reveals to Ted, out of nowhere, that he can read Sumerian, allowing him to translate the Necronomicon's incantations and let Evil roam free once again. * Series.{{Carnivale}}: During the fireball show, Samson secretly passes an old Crusader fob up to the stage during Lodz's psychometry act. When Lodz touches it, he's struck by visions of a holy war and begins chanting ''"In hoc signo vinces!"'' Ben, an uneducated farmboy, surprises Samson by translating even though he doesn't even recognize the language is Latin: '''Ben Hawkins:''' By this sign we conquer... by this sign we conquer... * WesternAnimation.SpaceChimps2ZartogStrikesBack: Zartog understands the humans, despite having just arrived on Earth. He even understands Spanish, Hindi, and Xhosa upon hearing them for the first time, making the humans' inability to understand him even more hilarious. [[/folder]]
submitted by Similar_Set_6582 to bannedtvtropers [link] [comments]


2024.06.01 12:11 Similar_Set_6582 My account was banned from TV Tropes. Can someone do me a favor?

Just copy this, paste this into a sandbox, then make a thread in the Trope Repair Shop linking to it. The thread shall be titled "Needs Help: Inexplicable Language Fluency".
A wick check for InexplicableLanguageFluency.
'''Why?''': InexplicableLanguageFluency seems to lump in examples of characters demonstrating fluency in a language they've never learned with examples of characters merely understanding a language they never learned. For one, it doesn't make sense for the latter examples to be there since the title has the word "fluency" in it. Secondly, many of the latter examples can be moved to BilingualDialogue or AmplifiedAnimalAptitude.
'''Wicks checked''': 50/72
'''Wick totals''':
* 21/50 wicks, or 11.5%, were clear examples of a character demonstrating fluency in a language they never learned. * 24/50 wicks, or 12% were examples of a character understanding a language without demonstrating fluency. * 4/50 wicks, or 2%, were written in a way that made it unclear whether they were being used correctly. * 1/50 wicks, or 1.5%, was an aversion.
[[folder:Characters demonstrating fluency in a language they never learned (21/50)]] * Characters.AtlantisTheLostEmpire: The Atlanteans are able to fluently communicate with the explorers from the surface in English and French. This is handwaved as their own language being a precursor to all other languages. * Characters.YaBoyKongming: ** Kongming finds himself immediately fluent in Japanese despite having never learned the language, and the Japan of his era was still a dwarf country that mostly only sent envoys to Shu's rival kingdom of Wei. Eiko points out the absurdity of how an ancient Chinese figure can suddenly speak Japanese, and Kongming himself is at a loss of how he can do this. * Characters.YellowjacketsCrashSurvivors: [[spoiler:She, and by extension others in the Antler Queen cult she forms, randomly speak French on occasion, which Lottie at least is ''not'' fluent in, indicating mental instability.]] '''If this is supposed to be an inversion, it's correct, but it should be rewritten to mention that it's an inversion.''' * ComicBook.Asterix: Gauls can speak to most foreign peoples without any difficulty. Either everyone can speak Gaulish, or the Gauls can speak other languages, but either way is downright impossible. * ComicBook.UltimateFantasticFour: ** Downplayed. Namor demonstrates this trope after spending nine thousand years of being sealed inside a sarcophagus prison. Being from an ancient civilization of Atlantis that predates the modern English language, he initially has no idea what his new captors are saying when they find him. However, thanks to his psionic powers he proves to be a frighteningly fast learner and is able to learn English within an hour to understand modern American English and casually deliver threats and demands. ** Happens again when the Four meets the Silver Surfer who is able to understand English perfectly the instant he arrives to Earth for the first time. He justifies this by stating can decipher electromagnetic signals in the air to learn alien tongues in seconds. * Friends.TropesFToJ: Zig-zagged. "The One Where Ross Can't Flirt" has her speaking with Joey's Italian grandmother, with Phoebe seeming as surprised about this as Joey. But in an earlier episode, Rachel's former lover Paulo calls her "bellissima" and she doesn't understand. ->'''Joey:''' Wow, Phoebs, you speak Italian?\\ '''Phoebe:''' Apparently. * Film.ExorcistTheBeginning: In-universe, the main characters note Cheche's sudden fluency in English after being brought to the hospital. Of course this is because it is Pazuzu speaking through him and not actually Cheche. '''This is DiscussedTrope, not InUniverse'''. * Film.{{Phenomenon}}: Downplayed. After a UFO encounter, George gains a form of superintelligence where he can learn complicated tasks and subjects quickly, to the point where he's able to fluently speak Portuguese after reading an English-to-Portuguese dictionary for twenty minutes. This was foreshadowed in an earlier scene where after usually fumbling to speak Spanish, he's suddenly able to speak it perfectly, to the point that his Hispanic coworker half-seriously declares that it's better than his. * Literature.ForgottenRuin: When Talker and Vandahar speak during the final stretch to the Hidden Cave, Vandahar speaks in what we would recognize as modern German, which of course makes communicating with him very easy for Talker but raises a few questions. * Literature.InCryptid: Of the GeneticMemory version. 
When Candice the Dragon Princess comes up against the LizardFolk wandering about the New York sewers, she's able to make them halt using a guttural language. She reveals to Verity that it's a inborn language all Princesses know and something the "Servitors" are meant to respond to. She's also never used it before as Servitors can only be created from a [[spoiler:living male dragon]]. * Literature.SeasonalFears: Aven is essentially in a coma all her life but, thanks to the alchemical knowledge tinctures of her father', she speaks perfect English even if there are some issues with definitions. * Literature.ShiversMDSpenser: Bubbie moves to France with her family and meets a new French kid, Jean-Luc, and can somehow converse with him without any language issues; at the end of the story she can hold a long conversation with a stranger in French, despite spending less than a week in the country. Previous books with an overseas setting will at least attempt justifying this trope (for instance, ''Terror on Troll Mountain'' has the protagonist's Italian family choosing to speak in English because they're anticipating American guests) but this one handwaves the issue until it's a borderline plot hole. * Recap.AsterixAndTheGoths: Getafix is able to speak fluent Gothic, despite being a Gaulish druid who has no logical reason to know it (especially considering how antagonistic the Goths are in this story). * Recap.FriendsS5E19TheOneWhereRossCantFlirt: Phoebe speaks with Joey's Italian grandmother, with [[InexplicableLanguageFluency Phoebe seeming as surprised about this as Joey]]. '''Joey:''' Wow, Phoebs, you speak Italian?\\ '''Phoebe:''' Apparently. * Recap.TalesFromTheDarksideS4E6TheGraveRobber: Aileen is surprised to hear Tapok speaking English when he wakes up, the mummy giving an offhand mention that he can speak the language of anyone who violates the tomb. * Roleplay.WanyaKingdomVSAwoofyUnity: Fragment has been shown to speak any language even if he has never heard it before. * Series.{{Llanargollen}}: In "Dirgelwch y Llyfr Coll", Prys and Barti speak to each other in Swahili. It seems incredibly unlikely that Prys would be fluent in that language, and Barti only speaks in unintelligible grunts with individual words thrown in. The other characters barely react to this. * VideoGame.EternalReturnBlackSurvival: The test subjects are able to work together with each other in spite of how one would presume that the vast majority of them would be completely unable to verbally communicate with most of the others. {{Averted}} by some of the achievement lore entries, which include statements observed from test subjects that some of which have a note underneath which states the words were translated by AI and will be reviewed and localized further in the future. '''Since this seems to be an implied example rather than a straight example, and it seems to be ambiguous InUniverse whether it's this or BilingualDialogue, I'll leave it here.''' * VideoGame.MegaManBattleNetwork2?: [=MegaMan=]'s translation system allows Lan to speak flawless Netopian after they arrive in Netopia. * WesternAnimation.{{Pocahontas}}: English explorer John Smith meets the alluring Pocahontas of the Powhatan tribe. Smith doesn't know her language, nor she his. That is, until Pocahontas takes a breath of magic wind. Suddenly, her English is better than his. [[/folder]]
[[folder:Aversions (1/50)]] * DarthWiki.{{Aquilaverse}}: ''Averted'' Not inexplicable, since her spell "Xenographus" let's her read any language she encounters for a limited duration. [[/folder]]
[[folder:Unclear (4/50)]] * Characters.FrightKrewe: His power makes him this. He has the ability to communicate with anyone and understand anything they say, whether its a [[InexplicableLanguageFluency person]], an [[SpeaksFluentAnimal animal]], or an [[NatureSpirit 'elemental entity']]. '''The way the example is written, it's not clear whether it's this or BilingualDialogue.''' * Recap.CreepshowS1E3BadWolfDown: Doc is shown to speak French and understands the language quite fluently, allowing him to translate the explanation of the werewolf woman's plight. '''You can't "understand a language fluently". Does he ''speak'' it fluently, or at least speak it better than he should realistically be able to?''' * Series.AlloAllo: Frenchmen, Germans, and Italians can talk to each other. This is never explained. '''In their own languages, or...?''' * WesternAnimation.SpaceChimps: The [[SpeaksFluentAnimal aliens]] and [[InexplicablySpeaksFluentAlien chimps]] understand each other, despite that the chimps have just arrived on their planet. '''Does one of them speak the other's language, or do they just understand each other's languages?''' [[/folder]]
!!!'''Characters being able to understand a language without demonstrating fluency (24/50) 12%''' [[folder:Amplified Animal Aptitude (7/50) '''3.5%''']] * Literature.MrsFrisbyAndTheRatsOfNIMH: The animals can understand humans and each other, but can't talk to humans. This leads to a bit of FridgeLogic as to where exactly all these animals picked up English (justified for the rats, as they were taught it in the lab, but not for any of the others), and whether they can understand other human languages automatically as well. * Recap.TheSimpsonsS6E6TreehouseOfHorrorV: When Homer is sent back in time to a prehistoric era, and has to avoid stepping on anything to change the future, he ends up swatting a mosquito then asks for reassurance that it won't change the future. He is understood despite being in an era before English existed, since a sloth behind him shrugs and grunts as if to say "I don't know." * TheSimpsons.TropesIToM: ** In "Treehouse of Horror V", when Homer is sent back in time to a prehistoric era, and has to avoid stepping on anything to change the future, he ends up swatting a mosquito then asks for reassurance that it won't change the future. He is understood despite being in an era before English existed, since a sloth behind him shrugs and grunts as if to say "I don't know." '''Just checked and the sloth doesn’t say “I don’t know”, it just grunts with an intonation that indicates that that’s what it was saying. So, not fluent in English.''' * Tarzan.TropesGToZ ** Played straight when Jane talks to a baby baboon and he understands her. * TotalDrama.TropesHToP: Some wild animals--especially squirrels--can understand humans. * WesternAnimation.FindingNemo: Dory understands English. The fish in the tank understand English because they've been hearing humans all their life, but there's no reason for Dory to understand English. [[spoiler:Subverted in that ''WesternAnimation/FindingDory'' reveals she was raised in a marine life institute in California.]] '''Since it says “understand” and not “speak” I can only assume this is a case of TranslationConvention and AnimalTalk, which means they aren’t actually fluent in English.''' * WesternAnimation.SpiritStallionOfTheCimarron: Spirit, despite not having had any human contact before his capture, understands what the human characters are saying. [[/folder]]
[[folder:Bilingual Dialogue (11/50)]] * Characters.MashupWeekMegamixCompetitors: Despite the fact that all 3 members speak different languages (English for Scoop, French for Eric and Italian for Baba), they can be understood by each other and by others easily. * Film.NoTimeToDie: James Bond's 5-year-old daughter, Mathilde, understands English even though her mother Madelaine speaks to her in French and all the media she consumes is in French. * Film.Titanic1997: Jack and the Swedish men understand each other well enough that Olaf is able to bet his tickets for all Jack and Fabrizio's money. * FinalFantasyXIV.TropesGToI?: ** In ''Heavensward'', the Warrior and Alphinaud are startled by how they can understand the great wyrm Hraesvelgr, who only speaks the language of dragons. They learn that while other races may not speak the dragon's language, the inherent power and magic in a dragon's song essentially beams meaning directly into the hearer's mind. ** The Scions are shocked in ''Shadowbringers'' when they meet [[spoiler:Emet-Selch's recreation of the lost city of Amaurot, the home of the BenevolentPrecursor race. Although the ancient version of mankind spoke only in a StarfishLanguage of droning tones, the Scions born millennia after the world was sundered find that they're able to discern meaning from it.]] * Literature.CookingWithWildGame: Lampshaded. One of the few supernatural parts of the setting is that [[TrappedInAnotherWorld Asuta]] can understand its inhabitants' language (almost) perfectly. Nobody knows why, nor do they ever find out; it's essentially one of the AcceptableBreaksFromReality needed to get the plot going. * Literature.{{Fablehaven}}: After receiving magic from fairies, Kendra gains the ability to interpret their language intuitively despite never actually learning it. * Manga.MiriyaAndMarie: Aside of occasional GratuitousFrench, Miriya can understand all French and animal characters and vice versa. Blackey also tries to charge Miriya for massage in yen despite being French. * VideoGame.KnightsOfTheOldRepublic: Yes, your character understands plenty of languages, but there isn't anyone who ''should'' understand the language of the AbusivePrecursors. [[spoiler: Subverted. Turns out you'd been there before and learned the language, but just forgotten the whole thing between Malak's backstab and whatever the Jedi did to "rebuild" you]]. '''When moving this to BilingualDialogue, I would rewrite it as “Your character understands plenty of languages, but can only speak one.''' * WebOriginal.MashupWeek: In both tournaments, Flat Eric speaks French, but everyone can understand him perfectly. Same with Gigi, who speaks Italian. * WesternAnimation.JosieAndThePussycats: ''[[RecycledWithAGimmick In Outer Space]]'' has Melody encounter an alien creature that looks like cotton candy with eyes and flippers. The critter speaks in a series of bleeps, and Melody names him Bleep. [[InexplicablySpeaksFluentAlien She can also understand him perfectly, and acts as translator for everyone else]]. Also, Bleep seems to understand English just fine. '''I also think InexplicablySpeaksFluentAlien should be limited to people actually speaking Alien instead of just understanding it, but that’s for another time.''' [[/folder]]
[[folder:Other (6/50)]] * Characters.InfinityTrainSeekerOfCrocusMultiverse: Upon looking at some files from the old phone Oscar picked up, he's somehow able to read ''Aztec glyphs'' just by looking at it. [[spoiler:This is probably due to his covenant with Agares, as one of Agares' ability is to let his conjurer read and understand different languages.]] * Fanfic.InfinityTrainSeekerOfCrocus: Kaito is somehow able to read ''Aztec glyphs'' just by looking at it. [[spoiler:This is probably due to his covenant with Agares, as one of Agares' ability is to let his conjurer read and understand different languages.]] * Characters.YaBoyKongming: ** Kongming also demonstrates an ability to understand English, as he is able to tell that the English song Eiko sings to him is a love song, with enough comprehension to be moved by a vision of his old comrades due to the lyrics referencing being still alive and alone. Later, upon meeting Tsuyoshi Kondo after Eiko's performance at the Yoyogi Art Festival, Kongming easily translates his GratuitousEnglish for a confused Kobayashi, and he's even become friends with a foreigner in Roppongi, who speaks exclusively in English. He's capable of some GratuitousEnglish himself, to boot. * Recap/CreepshowS2E2PublicTelevisionOfTheDead: Goodman reveals to Ted, out of nowhere, that he can read Sumerian, allowing him to translate the Necronomicon's incantations and let Evil roam free once again. * Series.{{Carnivale}}: During the fireball show, Samson secretly passes an old Crusader fob up to the stage during Lodz's psychometry act. When Lodz touches it, he's struck by visions of a holy war and begins chanting ''"In hoc signo vinces!"'' Ben, an uneducated farmboy, surprises Samson by translating even though he doesn't even recognize the language is Latin: '''Ben Hawkins:''' By this sign we conquer... by this sign we conquer... * WesternAnimation.SpaceChimps2ZartogStrikesBack: Zartog understands the humans, despite having just arrived on Earth. He even understands Spanish, Hindi, and Xhosa upon hearing them for the first time, making the humans' inability to understand him even more hilarious. [[/folder]]
submitted by Similar_Set_6582 to OnlineFavors [link] [comments]


2024.06.01 12:10 Similar_Set_6582 My account was banned from TV Tropes. Can someone do me a favor?

Just copy this, paste it into a sandbox, then make a thread in the Trope Repair Shop linking to it. The thread shall be titled "Needs Help: Inexplicable Language Fluency".
A wick check for InexplicableLanguageFluency.
'''Why?''': InexplicableLanguageFluency seems to lump in examples of characters demonstrating fluency in a language they've never learned with examples of characters merely understanding a language they never learned. For one, it doesn't make sense for the latter examples to be there since the title has the word "fluency" in it. Secondly, many of the latter examples can be moved to BilingualDialogue or AmplifiedAnimalAptitude.
'''Wicks checked''': 50/72
'''Wick totals''':
* 21/50 wicks, or 42%, were clear examples of a character demonstrating fluency in a language they never learned.
* 24/50 wicks, or 48%, were examples of a character understanding a language without demonstrating fluency.
* 4/50 wicks, or 8%, were written in a way that made it unclear whether they were being used correctly.
* 1/50 wicks, or 2%, was an aversion.
[[folder:Characters demonstrating fluency in a language they never learned (21/50)]]
* Characters.AtlantisTheLostEmpire: The Atlanteans are able to fluently communicate with the explorers from the surface in English and French. This is handwaved as their own language being a precursor to all other languages.
* Characters.YaBoyKongming:
** Kongming finds himself immediately fluent in Japanese despite having never learned the language, and the Japan of his era was still a dwarf country that mostly only sent envoys to Shu's rival kingdom of Wei. Eiko points out the absurdity of how an ancient Chinese figure can suddenly speak Japanese, and Kongming himself is at a loss as to how he can do this.
* Characters.YellowjacketsCrashSurvivors: [[spoiler:She, and by extension others in the Antler Queen cult she forms, randomly speak French on occasion, which Lottie at least is ''not'' fluent in, indicating mental instability.]] '''If this is supposed to be an inversion, it's correct, but it should be rewritten to mention that it's an inversion.'''
* ComicBook.Asterix: Gauls can speak to most foreign peoples without any difficulty. Either everyone can speak Gaulish, or the Gauls can speak other languages, but either way is downright impossible.
* ComicBook.UltimateFantasticFour:
** Downplayed. Namor demonstrates this trope after spending nine thousand years sealed inside a sarcophagus prison. Being from an ancient civilization of Atlantis that predates the modern English language, he initially has no idea what his new captors are saying when they find him. However, thanks to his psionic powers he proves to be a frighteningly fast learner and is able to learn English within an hour, enough to understand modern American English and casually deliver threats and demands.
** Happens again when the Four meet the Silver Surfer, who is able to understand English perfectly the instant he arrives on Earth for the first time. He justifies this by stating he can decipher electromagnetic signals in the air to learn alien tongues in seconds.
* Friends.TropesFToJ: Zig-zagged. "The One Where Ross Can't Flirt" has Phoebe speaking with Joey's Italian grandmother, with Phoebe seeming as surprised about this as Joey. But in an earlier episode, Rachel's former lover Paolo calls her "bellissima" and she doesn't understand.
->'''Joey:''' Wow, Phoebs, you speak Italian?\\
'''Phoebe:''' Apparently.
* Film.ExorcistTheBeginning: In-universe, the main characters note Cheche's sudden fluency in English after being brought to the hospital. Of course, this is because it is Pazuzu speaking through him and not actually Cheche. '''This is DiscussedTrope, not InUniverse.'''
* Film.{{Phenomenon}}: Downplayed. After a UFO encounter, George gains a form of superintelligence where he can learn complicated tasks and subjects quickly, to the point where he's able to fluently speak Portuguese after reading an English-to-Portuguese dictionary for twenty minutes. This was foreshadowed in an earlier scene where, after usually fumbling to speak Spanish, he's suddenly able to speak it perfectly, to the point that his Hispanic coworker half-seriously declares that it's better than his.
* Literature.ForgottenRuin: When Talker and Vandahar speak during the final stretch to the Hidden Cave, Vandahar speaks in what we would recognize as modern German, which of course makes communicating with him very easy for Talker but raises a few questions.
* Literature.InCryptid: Of the GeneticMemory version. When Candice the Dragon Princess comes up against the LizardFolk wandering about the New York sewers, she's able to make them halt using a guttural language. She reveals to Verity that it's an inborn language all Princesses know and something the "Servitors" are meant to respond to. She's also never used it before, as Servitors can only be created from a [[spoiler:living male dragon]].
* Literature.SeasonalFears: Aven is essentially in a coma all her life but, thanks to her father's alchemical knowledge and tinctures, she speaks perfect English even if there are some issues with definitions.
* Literature.ShiversMDSpenser: Bubbie moves to France with her family and meets a new French kid, Jean-Luc, and can somehow converse with him without any language issues; at the end of the story she can hold a long conversation with a stranger in French, despite spending less than a week in the country. Previous books with an overseas setting at least attempt to justify this trope (for instance, ''Terror on Troll Mountain'' has the protagonist's Italian family choosing to speak in English because they're anticipating American guests), but this one handwaves the issue until it's a borderline plot hole.
* Recap.AsterixAndTheGoths: Getafix is able to speak fluent Gothic, despite being a Gaulish druid who has no logical reason to know it (especially considering how antagonistic the Goths are in this story).
* Recap.FriendsS5E19TheOneWhereRossCantFlirt: Phoebe speaks with Joey's Italian grandmother, with [[InexplicableLanguageFluency Phoebe seeming as surprised about this as Joey]].
->'''Joey:''' Wow, Phoebs, you speak Italian?\\
'''Phoebe:''' Apparently.
* Recap.TalesFromTheDarksideS4E6TheGraveRobber: Aileen is surprised to hear Tapok speaking English when he wakes up, the mummy giving an offhand mention that he can speak the language of anyone who violates the tomb.
* Roleplay.WanyaKingdomVSAwoofyUnity: Fragment has been shown to speak any language even if he has never heard it before.
* Series.{{Llanargollen}}: In "Dirgelwch y Llyfr Coll", Prys and Barti speak to each other in Swahili. It seems incredibly unlikely that Prys would be fluent in that language, and Barti only speaks in unintelligible grunts with individual words thrown in. The other characters barely react to this.
* VideoGame.EternalReturnBlackSurvival: The test subjects are able to work together in spite of how one would presume that the vast majority of them would be completely unable to verbally communicate with most of the others. {{Averted}} by some of the achievement lore entries, which include statements observed from test subjects, some of which have a note underneath stating the words were translated by AI and will be reviewed and localized further in the future. '''Since this seems to be an implied example rather than a straight example, and it seems to be ambiguous InUniverse whether it's this or BilingualDialogue, I'll leave it here.'''
* VideoGame.MegaManBattleNetwork2?: [=MegaMan=]'s translation system allows Lan to speak flawless Netopian after they arrive in Netopia.
* WesternAnimation.{{Pocahontas}}: English explorer John Smith meets the alluring Pocahontas of the Powhatan tribe. Smith doesn't know her language, nor she his. That is, until Pocahontas takes a breath of magic wind. Suddenly, her English is better than his.
[[/folder]]
[[folder:Aversions (1/50)]]
* DarthWiki.{{Aquilaverse}}: ''Averted''. Not inexplicable, since her spell "Xenographus" lets her read any language she encounters for a limited duration.
[[/folder]]
[[folder:Unclear (4/50)]]
* Characters.FrightKrewe: His power makes him this. He has the ability to communicate with anyone and understand anything they say, whether it's a [[InexplicableLanguageFluency person]], an [[SpeaksFluentAnimal animal]], or an [[NatureSpirit 'elemental entity']]. '''The way the example is written, it's not clear whether it's this or BilingualDialogue.'''
* Recap.CreepshowS1E3BadWolfDown: Doc is shown to speak French and understands the language quite fluently, allowing him to translate the explanation of the werewolf woman's plight. '''You can't "understand a language fluently". Does he ''speak'' it fluently, or at least speak it better than he should realistically be able to?'''
* Series.AlloAllo: Frenchmen, Germans, and Italians can talk to each other. This is never explained. '''In their own languages, or...?'''
* WesternAnimation.SpaceChimps: The [[SpeaksFluentAnimal aliens]] and [[InexplicablySpeaksFluentAlien chimps]] understand each other, despite the chimps having just arrived on their planet. '''Does one of them speak the other's language, or do they just understand each other's languages?'''
[[/folder]]
!!!'''Characters being able to understand a language without demonstrating fluency (24/50) 48%'''
[[folder:Amplified Animal Aptitude (7/50) '''14%''']]
* Literature.MrsFrisbyAndTheRatsOfNIMH: The animals can understand humans and each other, but can't talk to humans. This leads to a bit of FridgeLogic as to where exactly all these animals picked up English (justified for the rats, as they were taught it in the lab, but not for any of the others), and whether they can understand other human languages automatically as well.
* Recap.TheSimpsonsS6E6TreehouseOfHorrorV: When Homer is sent back in time to a prehistoric era and has to avoid stepping on anything so as not to change the future, he ends up swatting a mosquito, then asks for reassurance that it won't change the future. He is understood despite being in an era before English existed, since a sloth behind him shrugs and grunts as if to say "I don't know."
* TheSimpsons.TropesIToM:
** In "Treehouse of Horror V", when Homer is sent back in time to a prehistoric era and has to avoid stepping on anything so as not to change the future, he ends up swatting a mosquito, then asks for reassurance that it won't change the future. He is understood despite being in an era before English existed, since a sloth behind him shrugs and grunts as if to say "I don't know." '''Just checked, and the sloth doesn't say "I don't know"; it just grunts with an intonation that indicates that's what it was saying. So, not fluent in English.'''
* Tarzan.TropesGToZ:
** Played straight when Jane talks to a baby baboon and he understands her.
* TotalDrama.TropesHToP: Some wild animals--especially squirrels--can understand humans.
* WesternAnimation.FindingNemo: Dory understands English. The fish in the tank understand English because they've been hearing humans all their lives, but there's no reason for Dory to understand English. [[spoiler:Subverted in that ''WesternAnimation/FindingDory'' reveals she was raised in a marine life institute in California.]] '''Since it says "understand" and not "speak", I can only assume this is a case of TranslationConvention and AnimalTalk, which means they aren't actually fluent in English.'''
* WesternAnimation.SpiritStallionOfTheCimarron: Spirit, despite not having had any human contact before his capture, understands what the human characters are saying.
[[/folder]]
[[folder:Bilingual Dialogue (11/50)]]
* Characters.MashupWeekMegamixCompetitors: Despite the fact that all 3 members speak different languages (English for Scoop, French for Eric and Italian for Baba), they can be understood by each other and by others easily.
* Film.NoTimeToDie: James Bond's 5-year-old daughter, Mathilde, understands English even though her mother Madeleine speaks to her in French and all the media she consumes is in French.
* Film.Titanic1997: Jack and the Swedish men understand each other well enough that Olaf is able to bet his tickets for all Jack and Fabrizio's money.
* FinalFantasyXIV.TropesGToI?:
** In ''Heavensward'', the Warrior and Alphinaud are startled by how they can understand the great wyrm Hraesvelgr, who only speaks the language of dragons. They learn that while other races may not speak the dragons' language, the inherent power and magic in a dragon's song essentially beams meaning directly into the hearer's mind.
** The Scions are shocked in ''Shadowbringers'' when they meet [[spoiler:Emet-Selch's recreation of the lost city of Amaurot, the home of the BenevolentPrecursor race. Although the ancient version of mankind spoke only in a StarfishLanguage of droning tones, the Scions born millennia after the world was sundered find that they're able to discern meaning from it.]]
* Literature.CookingWithWildGame: Lampshaded. One of the few supernatural parts of the setting is that [[TrappedInAnotherWorld Asuta]] can understand its inhabitants' language (almost) perfectly. Nobody knows why, nor do they ever find out; it's essentially one of the AcceptableBreaksFromReality needed to get the plot going.
* Literature.{{Fablehaven}}: After receiving magic from fairies, Kendra gains the ability to interpret their language intuitively despite never actually learning it.
* Manga.MiriyaAndMarie: Aside from occasional GratuitousFrench, Miriya can understand all French and animal characters and vice versa. Blackey also tries to charge Miriya for a massage in yen despite being French.
* VideoGame.KnightsOfTheOldRepublic: Yes, your character understands plenty of languages, but there isn't anyone who ''should'' understand the language of the AbusivePrecursors. [[spoiler:Subverted. Turns out you'd been there before and learned the language, but just forgotten the whole thing between Malak's backstab and whatever the Jedi did to "rebuild" you]]. '''When moving this to BilingualDialogue, I would rewrite it as "Your character understands plenty of languages, but can only speak one."'''
* WebOriginal.MashupWeek: In both tournaments, Flat Eric speaks French, but everyone can understand him perfectly. Same with Gigi, who speaks Italian.
* WesternAnimation.JosieAndThePussycats: ''[[RecycledWithAGimmick In Outer Space]]'' has Melody encounter an alien creature that looks like cotton candy with eyes and flippers. The critter speaks in a series of bleeps, and Melody names him Bleep. [[InexplicablySpeaksFluentAlien She can also understand him perfectly, and acts as translator for everyone else]]. Also, Bleep seems to understand English just fine. '''I also think InexplicablySpeaksFluentAlien should be limited to people actually speaking an alien language instead of just understanding it, but that's for another time.'''
[[/folder]]
[[folder:Other (6/50)]]
* Characters.InfinityTrainSeekerOfCrocusMultiverse: Upon looking at some files from the old phone Oscar picked up, he's somehow able to read ''Aztec glyphs'' just by looking at them. [[spoiler:This is probably due to his covenant with Agares, as one of Agares' abilities is to let his conjurer read and understand different languages.]]
* Fanfic.InfinityTrainSeekerOfCrocus: Kaito is somehow able to read ''Aztec glyphs'' just by looking at them. [[spoiler:This is probably due to his covenant with Agares, as one of Agares' abilities is to let his conjurer read and understand different languages.]]
* Characters.YaBoyKongming:
** Kongming also demonstrates an ability to understand English, as he is able to tell that the English song Eiko sings to him is a love song, with enough comprehension to be moved by a vision of his old comrades due to the lyrics referencing being still alive and alone. Later, upon meeting Tsuyoshi Kondo after Eiko's performance at the Yoyogi Art Festival, Kongming easily translates his GratuitousEnglish for a confused Kobayashi, and he's even become friends with a foreigner in Roppongi who speaks exclusively in English. He's capable of some GratuitousEnglish himself, to boot.
* Recap/CreepshowS2E2PublicTelevisionOfTheDead: Goodman reveals to Ted, out of nowhere, that he can read Sumerian, allowing him to translate the Necronomicon's incantations and let Evil roam free once again.
* Series.{{Carnivale}}: During the fireball show, Samson secretly passes an old Crusader fob up to the stage during Lodz's psychometry act. When Lodz touches it, he's struck by visions of a holy war and begins chanting ''"In hoc signo vinces!"'' Ben, an uneducated farmboy, surprises Samson by translating even though he doesn't even recognize that the language is Latin:
->'''Ben Hawkins:''' By this sign we conquer... by this sign we conquer...
* WesternAnimation.SpaceChimps2ZartogStrikesBack: Zartog understands the humans despite having just arrived on Earth. He even understands Spanish, Hindi, and Xhosa upon hearing them for the first time, making the humans' inability to understand him even more hilarious.
[[/folder]]
submitted by Similar_Set_6582 to FuckFighteer [link] [comments]


2024.06.01 12:08 BL4CKL3R Artifacts suddenly appear in Games when Second monitor starts playing videos

Hi All,
I hope people can help me solve this issue.
While playing mindless games like WoW, Diablo, etc. I like to watch videos on my 2nd Monitor. Unfortunately, every now and again, it seems to be causing weird artifacts on my Main monitor. It usually happens like this.
I'm in game, doing whatever I'm doing. I then go to my 2nd monitor, load Netflix/YouTube/insert streaming service here, and start watching something. I then click back to my game and both monitors flicker, as if changing resolution or input, then come back quickly; but now the game on my main monitor is artifacted and almost looks corrupted in some fashion (text is illegible, and certain on-screen UI assets, such as the settings/menu, are discoloured and cannot be clicked, as if I'm not clicking in the right area of the screen).
To solve this I'm forced to close my game and reload it, and after that it's fine. However, I should point out that my main monitor is also HDR, and it goes from being "vibrant and crisp" to being "washed out and dull" until I restart my PC.
I checked, and HDR is still ticked in the Display Settings, suggesting it's on, but what I see says otherwise.
I've tried the following:
I'm slightly concerned my GPU is getting ready to burn out, which I REALLY hope it isn't.
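In case logs would help anyone diagnose this, here's roughly what I plan to capture the next time it glitches (a minimal sketch, assuming dxdiag and nvidia-smi are reachable on the PATH; dxdiag ships with Windows, nvidia-smi ships with the NVIDIA driver, and the output filename is just a placeholder):

import subprocess

# Dump full DirectX/display diagnostics to a text file (dxdiag is built into Windows).
subprocess.run(["dxdiag", "/t", "dxdiag_artifacting.txt"], check=True)

# Snapshot GPU temperature, clocks, and power draw at the moment of the glitch
# (nvidia-smi is installed alongside the NVIDIA driver).
result = subprocess.run(
    ["nvidia-smi", "--query-gpu=temperature.gpu,clocks.gr,clocks.mem,power.draw",
     "--format=csv"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)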
Specs are as follows:
MB: MSI X570S Carbon Max
GPU: MSI RTX 3080 Suprim
CPU: Ryzen 7 5800X3D
RAM: 2 x Corsair Vengeance 16GB (32GB total) DDR4 3600 MHz C18
PSU: Corsair RM850x 80 Plus Gold
OS: WINDOWS 11
Monitor 1: Samsung 34" G55T UWQHD 165Hz Odyssey
Monitor 2: Acer 1080p Cheap Monitor inherited from work
Any help would be greatly appreciated. Thanks all.
Links to screenshots of WoW during the artifacting:
https://ibb.co/h2FnB7
https://ibb.co/LNcVs5V
https://ibb.co/dgfqjHL
Please note my FPS has also dropped from 120 to 10 in these screenshots.
I also noticed a new error from Netflix, but its help centre isn't much help.
https://ibb.co/x3hz2bZ
Netflix Error D7353-5102-6 Netflix Help Center
submitted by BL4CKL3R to techsupport [link] [comments]


2024.06.01 12:08 sartoriallyspeaking Building up distance and/or speed when only running 3x per week

I currently run 3x per week. It works really well for me and I absolutely do not want to add more runs per week. I have been running consistently for a year and have no injuries to speak of.
My current goals are:
– to have a 5k race time of 30:59 or lower. My current best time is around 32 or 33 minutes.
– to be able to run 10k at any pace consistently and healthily
– to maintain running as something I enjoy, which means (for me) that I can't turn it into something that demands getting as close to perfection as possible (optimize training, optimize macros, optimize form, optimize shoes, etc.). It needs to be something I just...do.
My three weekly runs are currently 23.5 minutes and I always run at 8.5 kph. I am adding 10% each week. I know that I could easily run farther, but I am trying to build up distance at a rate that will reduce the chance of injury and increase the chance of sticking with it.
My thought is that once I reach a 30:00 run, i.e. roughly my goal 5k time, I could split the runs as follows (a quick timeline sketch follows the list):
A: continue @ 8.5kph and increasing distance by 10% each week
B: run for 30 minutes, increasing speed by X (0.1kph?) each week
C: sprint 30s, jog until heart rate lowers, repeat for X minutes
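For reference, here's a back-of-the-envelope script for the timeline, using only the numbers above (10% duration growth per week, and +0.1 kph per week for plan B; the 30:59 goal works out to about 9.7 kph, and the variable names are just for illustration):

import math

# Phase 1: grow run duration by 10% per week until hitting 30 minutes.
current_run_min = 23.5
target_run_min = 30.0
weekly_growth = 1.10
weeks_to_30min = math.ceil(math.log(target_run_min / current_run_min) / math.log(weekly_growth))
print(f"Weeks until a 30-minute run: {weeks_to_30min}")  # -> 3

# Plan B: bump speed by 0.1 kph per week until reaching goal 5k pace.
goal_pace_kph = 5.0 / (30 + 59 / 60) * 60  # 30:59 over 5 km is ~9.68 kph
current_speed_kph = 8.5
speed_step_kph = 0.1
weeks_to_goal_pace = math.ceil((goal_pace_kph - current_speed_kph) / speed_step_kph)
print(f"Weeks of plan B until goal 5k pace: {weeks_to_goal_pace}")  # -> 12

So roughly 3 weeks of building to the 30-minute mark, then about 12 weeks of plan B to reach goal pace.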
Does that sound like a reasonable plan? Are there any obvious flaws?
Thank you so much.
submitted by sartoriallyspeaking to XXRunning [link] [comments]


2024.06.01 12:01 Lenic707 [Hobby] Noob Environment designer looking for like-minded people who want to learn/work together

TL;DR: Want to find kind-hearted, respectful people to learn alongside and hopefully even work on fantasy rpg projects with
I am not new to game design, but I would still consider myself inexperienced: I have been dabbling with environment/level design in UE for the past couple of years, but I haven't gotten very far, mostly due to a lack of motivation, a lack of a solid goal to reach for, and a lack of game design friends.
I tried INAT before and met some cool and diverse people, made a game design Discord server with them, and even started working on prototypes, but everyone had a different dream game they wanted to make, so it didn't last long. So I want to find people who want to work on the same types of games I do, which is unlikely, but I figured I'd post just in case I get lucky.
The types of games I would want to work on would include D&D-esque fantasy settings (or even future-fantasy settings), magic systems, life and combat skills, tab-targeted or soulslike combat, character customization, stories/quests with at least a little depth to them (not generic fetch quests), NPCs with depth and character, different nations/cultures, and technology affected by the presence of magic.
The main goal I would want to work towards is a fantasy classless RPG that is heavily skill- and exploration-based, putting emphasis on exploring the world, picking up skills, and leveling those skills along the way, so you can build your character the way you want, ideally without the pull to just choose a meta. Think games like Project Gorgon or Kenshi, if you've heard of them.
I've always been in love with game environments; I started with WoW and Skyrim and went from there. Ever since I was young I've wanted to create a world that gives that same feeling of adventure and wanderlust. I love MMORPGs specifically, but they have become very similar in that most put the player character on rails and heavily guide the player along a certain path. That's not a bad thing; FFXIV does it and it's a fantastic game. But I also love TTRPGs like D&D and Pathfinder that give players the freedom to weave their own story. I'm not saying I want to make an MMORPG, as that is a laughably unrealistic goal to set myself, but I want to make a game that evokes the same feelings.
I'm not looking to immediately get to work on a game; I want to find like-minded people who want to learn together and practice working together. We don't have to work on my idea specifically, but I would prefer staying in the same ballpark (i.e. classless skills, combat, magic, some sort of fantasy setting).
The tools I would prefer to use, and have experience in, are Unreal Engine and Git.
It doesn't really matter what field you want to specialize in, as long as you're passionate, willing to learn, willing and able to contribute, and want to just have fun on the journey.
Morals are important as well. I would like to work with people who are respectful and keep an open mind; in other words, people who don't hate on LGBTQ+ folks and aren't misogynistic or bigoted.
Reply in the comments or DM me if you're interested in working on projects together. It'd be appreciated if you tell me a bit about yourself in your comment/message. I'd also appreciate it if you only reply if you share a similar enough goal and the same morals; I don't want to waste your time or mine if we don't share a common focus or wouldn't get along.
Some of my work: https://postimg.cc/gallery/sVFY4gZ
https://www.youtube.com/watch?v=HfMQGqEytgo&ab_channel=Jess
submitted by Lenic707 to INAT [link] [comments]

