Can you take apple cider vinegar when taking coumadin

Kombucha

2009.04.22 04:55 lencioni Kombucha

Kombucha is a fermented, fizzy, tea-based drink made using a combination of bacteria and yeast. This sub is for homebrewers and others who appreciate kombucha.
[link]


2008.08.19 08:38 GERD, Acid Reflux and Heartburn

A subreddit for people with the condition known as gastroesophageal reflux disease (GERD).
[link]


2011.07.06 18:14 ev149 Post Processing

[link]


2024.06.01 14:26 2Dement3D What's your BEST solo queue experience?

People are generally more likely to talk about their bad, frustrating experiences rather than their good ones, so let's try and discuss some of our better, positive moments.
What's your best experience so far in Solo queue?
I'll go first. I had a match the other day that was incredibly brutal. I was Grux Jungle, while our Midlane was a Morigesh and our Offlane was an Aurora. The other team had FOUR (4) Magic users: Offlane Shinbi, Jungle Aurora, Mid Gideon, and Support Gadget. The current Magic-based meta feels crazy, so I knew immediately it was going to be a rough match if they knew what they were doing, and they did.
Match begins, and our Aurora starts trying to Jungle. I ping to tell them to go defend Offlane, and they run off and do so. I assumed they had just forgotten they were Offlane. After a few minutes, though, they run right back into the Jungle as the Jungle minions respawn and wipe them out. Then they run across the map and start taking out the Jungle minions next to Duo lane while I'm helping Mid, so I had to run to Offlane and defend it. This happened throughout the match, and I would have been fine adjusting and going full Offlane while they Jungled, but they were not trying to gank anyone. They were just killing the Jungle minions and then running back to Offlane, regardless of whether their towers were being attacked. I ended up running to protect Offlane for half the match because of this, and ended up woefully underpowered.
The match overall was going very, very poorly. Morigesh was the only one with a positive KDA on our team. The whole thing felt like a massive write off, with no silver lining in sight. I noticed the enemy team was trying to take Orb Prime, so I thought, maybe we could get the ball rolling if I manage to Kamikaze steal it at the last second. I kept an eye on it and when it was the moment to strike, I jumped over the wall to get it and... got completely nuked as they took it.
"Grux.......", Morigesh puts in the chat. "Don't blame me for this match dude, I've had to spend all my time in Offlane because Aurora's rarely there", I respond.
"No, I'm not blaming. Let's not blame anyone, let's just group together and turn this game around" - Morigesh, 2024
Hearing that alone changed my entire mentality. You know the match is doomed. You know you're really the only one pulling their weight on the team, and you still have such a positive outlook? Well damn, if you can remain positive during this mess, then I will too.
I don't really know how to explain it, but after that, we all started working together, earning money, getting experience, killing players, catching up to the other team in terms of levels and items, and we ended up turning the game around deep into the late game. It was a full hour long match. Make no mistake, this Morigesh HARD CARRIED our team, both mentally and literally (they had like 30 kills by the end), but the way we all managed to go from the sorriest team you've ever seen to actually banding together? Felt like a damn Disney movie and my favorite experience so far with this game.
submitted by 2Dement3D to PredecessorGame [link] [comments]


2024.06.01 14:25 Happy-Necessary-6835 WIBTAH- to try to get closure?

Hi, I (21, female) want to get closure from my ex best friend. I've been best friends with her since high school, and I had a group of friends that we created in high school. Okay, so the full story is that there was a guy who liked to flirt with and harass people to the point they were uncomfortable. My ex best friend came to me about this, and we went back and forth in long paragraphs. She started to trauma dump on me about everything. I told her I'd talk to this guy about it, make him apologize to the people he had harassed, and set up boundaries. When she trauma dumped on me, I felt overwhelmed, because yes, I want to help out my friends who are being harassed by this person, but he's a grown-ass guy who should know right from wrong. Then I had friends leaving me because I was "friends" with the guy who liked to harass people. I never said I was friends with him. Anyway, I wanted to include that part of the story so you have the full context.
My best friend and I would talk about everything. The thing is, everyone knew she was a bit toxic in her way, but I didn't see it. (Just to let you know, I suck at recognizing who's good or bad for me.) Then my boyfriend pointed out what she had been doing to me and how she made me feel. Every time I was near her I would feel drained, or I would be upset because I felt stressed around her. The group we had were kind of hypocrites: I can't say certain things, but they can. Like one gay friend can say the f word, but another gay friend can't because it's wrong. Bad example, but she and I had been having problems. I hadn't been hanging out with her, and when I wanted to hang out with her, she would cancel last minute. She would have a problem with my boyfriend, saying that he was bickering at her, or other things. To the point that she wrote me many paragraphs saying that she and I had been taking our own paths. She saw me as her sister, but now she doesn't. She ended up blocking me for two months and then unblocked me a couple of weeks ago. I just need another perspective. There are other things I forgot to mention, but please don't come after me for them. I can post the screenshots to help you understand the situation. It's just that I needed to know if I was the ass in this story. Thank you for listening, and I'm really sorry for my bad grammar.
submitted by Happy-Necessary-6835 to WouldIBeTheAhole [link] [comments]


2024.06.01 14:25 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor's thesis I'm currently trying to train my first-ever model with TensorFlow, but I'm encountering a strange issue where my model only predicts 2 classes out of the 475 possible classes. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs in 82 nodes.
This is my training script:
    import os
    import tensorflow as tf
    from tensorflow.keras.preprocessing.image import ImageDataGenerator
    from tensorflow.keras.applications import EfficientNetB7
    from tensorflow.keras import layers, models
    from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
    import tensorflow_addons as tfa
    import logging
    import json

    # Setup logging
    logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

    # Check if GPUs are available
    gpus = tf.config.experimental.list_physical_devices('GPU')
    if gpus:
        try:
            for gpu in gpus:
                tf.config.experimental.set_memory_growth(gpu, True)
            tf.config.set_visible_devices(gpus, 'GPU')
            logging.info(f"Using {len(gpus)} GPUs.")
        except RuntimeError as e:
            logging.error(e)
    else:
        logging.error("No GPUs found. Check your device configuration.")

    # Data directory
    data_dir = "/app/FOOD475/"

    # Image dimensions and batch size
    img_height, img_width = 600, 600
    batch_size = 64

    # Data preprocessing and augmentation
    train_datagen = ImageDataGenerator(
        rescale=1./255,
        rotation_range=40,
        width_shift_range=0.2,
        height_shift_range=0.2,
        shear_range=0.2,
        zoom_range=0.2,
        horizontal_flip=True,
        fill_mode='nearest',
        validation_split=0.25
    )

    # Load and preprocess images
    train_generator = train_datagen.flow_from_directory(
        data_dir,
        target_size=(img_height, img_width),
        batch_size=batch_size,
        class_mode='categorical',
        subset='training'
    )
    validation_generator = train_datagen.flow_from_directory(
        data_dir,
        target_size=(img_height, img_width),
        batch_size=batch_size,
        class_mode='categorical',
        subset='validation'
    )

    # Model creation function
    def create_model(input_shape, num_classes):
        base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
        base_model.trainable = True
        inputs = layers.Input(shape=input_shape)
        x = base_model(inputs, training=True)
        x = layers.GlobalAveragePooling2D()(x)
        outputs = layers.Dense(num_classes, activation='softmax')(x)
        model = models.Model(inputs, outputs)
        return model

    def find_latest_saved_model(checkpoint_dir):
        logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
        if not os.path.exists(checkpoint_dir):
            logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
            return None, 0
        subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
                   if os.path.isdir(os.path.join(checkpoint_dir, d))]
        if not subdirs:
            logging.info("No subdirectories found for checkpoints.")
            return None, 0
        latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
        latest_epoch = int(os.path.basename(latest_subdir))
        logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
        if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
            return latest_subdir, latest_epoch
        else:
            logging.info("No saved_model.pb found in the latest directory.")
            return None, 0

    # Mirrored strategy for multi-GPU training
    strategy = tf.distribute.MirroredStrategy()
    with strategy.scope():
        saved_model_dir = 'model_training'
        checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
        latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
        if latest_saved_model:
            logging.info(f"Loading model from {latest_saved_model}")
            model = tf.keras.models.load_model(latest_saved_model)
        else:
            logging.info("No saved model found. Creating a new model.")
            model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

        if not os.path.exists(saved_model_dir):
            os.makedirs(saved_model_dir)

        summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
        with open(summary_path, 'w') as f:
            model.summary(print_fn=lambda x: f.write(x + '\n'))
        logging.info(f"Model summary saved to {summary_path}")

        optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
        model.compile(
            optimizer=optimizer,
            loss='categorical_crossentropy',
            metrics=['accuracy',
                     tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                     tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')]
        )

    # Custom callback for saving the best model in SavedModel format
    class SaveBestModelTF(tf.keras.callbacks.Callback):
        def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
            super(SaveBestModelTF, self).__init__()
            self.monitor = monitor
            self.saved_model_dir = saved_model_dir

        def on_epoch_end(self, epoch, logs=None):
            current = logs.get(self.monitor)
            if current is None:
                logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
                return
            logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
            epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
            if not os.path.exists(epoch_path):
                os.makedirs(epoch_path)
            self.model.save(epoch_path, save_format='tf')

    # Callbacks for monitoring progress
    tensorboard_cb = TensorBoard(log_dir='./logs')

    # Save class indices to a JSON file
    class_indices_path = 'model_training/class_indices.json'
    if not os.path.exists(os.path.dirname(class_indices_path)):
        os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
    with open(class_indices_path, 'w') as file:
        json.dump(train_generator.class_indices, file)
    logging.info(f"Class indices saved to {class_indices_path}")

    # Model training
    total_epochs = 7
    model.fit(
        train_generator,
        initial_epoch=latest_epoch,  # Start from the next epoch
        epochs=total_epochs,
        validation_data=validation_generator,
        callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
    )

    # Evaluate the model
    eval_result = model.evaluate(validation_generator)
    logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

    # Save the final model in SavedModel format (including .pb files)
    model.save('model_training/finished_model')
    logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

    # Convert to TensorFlow Lite
    converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
    tflite_model = converter.convert()
    tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
    if not os.path.exists(os.path.dirname(tflite_path)):
        os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
    with open(tflite_path, 'wb') as f:
        f.write(tflite_model)
    logging.info(f"Model converted and saved as {tflite_path}")
During training I got the following output:
    Found 182235 images belonging to 475 classes.
    Found 60544 images belonging to 475 classes.
    Epoch 1/7
    2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
    Epoch 2/7
    2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
    Epoch 3/7
    2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
    Epoch 4/7
    2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
    Epoch 5/7
    2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
    Epoch 6/7
    2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
    Epoch 7/7
    2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
    946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
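(Side note for anyone comparing runs: these Keras progress lines are plain text, so the per-epoch metrics can be pulled out programmatically, e.g. for plotting learning curves. A minimal sketch, assuming the exact "name: value" log format shown above:)

```python
import re

def parse_metrics(line):
    """Extract 'name: value' metric pairs from a Keras progress line."""
    return {name: float(val)
            for name, val in re.findall(r'(\w+): (\d+\.\d+)', line)}

# Epoch 1 line from the training output above
line = ("2848/2848 [==============================] - 11914s 4s/step - "
        "loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - "
        "f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043")
metrics = parse_metrics(line)
print(metrics["val_accuracy"])  # 0.7043
```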
And when I try to load the model and make a prediction with this code:
    import os
    import json
    import numpy as np
    import tensorflow as tf
    # F1Score must be registered on load; it matches the tfa metric used in the training script
    from tensorflow_addons.metrics import F1Score

    class own:
        def __init__(self):
            if not os.path.exists("models/own"):
                raise FileNotFoundError("Model path models/own does not exist")
            try:
                self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
            except Exception as e:
                print(f"Error loading model: {e}")
                raise
            if not os.path.exists("models/own/class_indices.json"):
                raise FileNotFoundError("Class indices path models/own/class_indices.json does not exist")
            with open("models/own/class_indices.json", 'r') as file:
                self.class_indices = json.load(file)
            self.index_to_class = {v: k for k, v in self.class_indices.items()}

        def classify(self, img_path):
            if not os.path.exists(img_path):
                raise FileNotFoundError(f"Image path {img_path} does not exist")
            # Load and preprocess the image
            img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
            img_array = tf.keras.preprocessing.image.img_to_array(img)
            img_array = np.expand_dims(img_array, axis=0)
            img_array /= 255.0
            # Make prediction
            predictions = self.model.predict(img_array)
            print("Raw predictions:", predictions)
            top_index = np.argmax(predictions[0])
            top_class = self.index_to_class[top_index]
            print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
            top_n = 5
            top_indices = np.argsort(predictions[0])[-top_n:][::-1]
            for idx in top_indices:
                print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
            return top_class
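One cheap thing to rule out (a toy illustration with made-up class names, not the real FOOD475 labels): confirm that inverting class_indices after the JSON round trip really gives an index-to-name mapping with integer keys, since json keeps the values as ints but would turn any integer keys into strings.

```python
import json

# Toy class_indices in the shape flow_from_directory produces (name -> index);
# the class names here are hypothetical
class_indices = {"omelette": 0, "steak": 1, "tacos": 2}

# Round-trip through JSON, as the training and inference scripts do
loaded = json.loads(json.dumps(class_indices))

# Values survive as ints, so inverting yields integer keys,
# which is what indexing with np.argmax(...) requires
index_to_class = {v: k for k, v in loaded.items()}
print(index_to_class[1])  # steak
```

If the mapping had instead been saved already inverted (index to name), json would have turned the integer keys into strings and `index_to_class[top_index]` would raise a KeyError, so saving name-to-index as above is the safe direction.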
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
1/1 [==============================] - 4s 4s/step
Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04
9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 
5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 
3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 
1.54079917e-05 9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
https://github.com/tensorflow/addons/issues/2807
https://github.com/tensorflow/addons
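For scale: with 475 classes, a completely uninformative softmax would assign 1/475, about 0.0021, to every class, so a top probability of about 0.04 means the loaded model is only roughly 19 times more confident than pure chance, nothing like the 80% validation accuracy logged during training. That usually points at a save/load or preprocessing mismatch rather than at the training run itself. A quick NumPy illustration of that comparison (the 0.0399 is taken from the output above):

```python
import numpy as np

num_classes = 475
uniform_prob = 1.0 / num_classes  # what "knows nothing" looks like per class
observed_top = 0.0399             # top class probability from the log above

print(f"uniform per-class probability: {uniform_prob:.4f}")  # 0.0021
print(f"lift over chance: {observed_top / uniform_prob:.1f}x")  # roughly 19x

# A near-uniform prediction also has entropy close to the maximum, log(475)
p = np.full(num_classes, uniform_prob)
max_entropy = -np.sum(p * np.log(p))
print(f"max entropy: {max_entropy:.2f} nats")
```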
Help would be appreciated, because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to computervision [link] [comments]


2024.06.01 14:25 lonelytunes09 How to assess election results

TLDR: I want to give you benchmarks and scientific processes to understand the outcome of the 2024 election without any prejudice.
There are many opinion polls and analyses giving extreme predictions of the 2024 outcome: one from Parakala Prabhakar predicts ~150 seats for the BJP, while some others predict 375. Coming from a family of politicians, let me explain, without any prejudice, how the process works and how to get a correct sense of the election.
Thumb rules: I will give a detailed explanation later, but please keep these rules in mind.
  1. Elections are won by grassroots cadres who bring voters to the voting booth. You cannot win elections unless that happens.
  2. Voters vote in their best interest, and they have clarity on that; the illiterate or poorly educated ones have even better clarity. Voters are not stupid, cannot be fooled, and the people's mandate cannot be stolen.
  3. वक़्त करता है परवरिश बरसों, हादसा एक दम नहीं होता (time nurtures things over the years; a calamity does not happen all at once). Things don't happen all of a sudden; they build up over a while. Likewise, people don't change their minds just because incident X happened. On a weighing scale, if you pour peanuts onto one pan and it tips on the 100th peanut, we cannot categorically say the 100th peanut was historic or that there was something special about it.
  4. Media and money cannot buy votes.
  5. EVMs cannot be hacked or manipulated.
  6. In God we trust; all others must bring data.
Let's say you go to a Mercedes showroom and tell the salesman that you have a strong feeling you should own this Mercedes, and that all your friends feel the same. You are not going to get that Mercedes unless the total money in your bank is more than the price of the car. If it isn't, the salesman will ask you to get lost, and that is what you should tell anyone who analyses something based on a feeling. This is why Parakala Prabhakar's analysis can be dismissed: he mentions at the beginning that it is his feeling.
Next in line are charlatans like Yogendra Yadav, who with their sweet talk mix data with their feelings. He takes data from previous elections and mixes in his feelings to predict the outcome. I would instead trust the analysis of Pradeep Gupta or News24's Today's Chanakya. These people have a very high rate of predicting election outcomes correctly, so we need to understand their methods and how they get it right.
To understand their methods, we must first understand how people arrive at a decision to vote for a specific party.
Every party has 3 distinct types of voters:
  1. Core voters: These voters will vote for a given party even if the sun rises in the west. A classic example is the BSP. Its core voters are the SC/ST community, and without any effort or any presence in the media or on social media in the 2022 UP election, it still got ~13% of the vote. Every major party has 10-20% core voters (of the total eligible voters).
The core voter base will vote for a party even without a leader; even if its leaders engage in crime or corruption, it doesn't matter. Long back I asked an old man who he would vote for in the gram panchayat polls. He said, Indira Gandhi. At the time I thought the old man was delusional; I now understand that he voted for Congress because he felt obligated to vote for Indira Gandhi even though she had died 3 decades earlier. A core voter feels a lifetime obligation towards a specific party for some incident that changed their life. E.g. core voters of the Shiv Sena vote for it because Shiv Sainiks saved their lives in the riots after the Babri demolition, and BSP voters vote for Kanshi Ram's party because he educated them about their constitutional rights and gave them a dignified life.
  2. Loyal voters: These voters are loyal to a specific party or leader; however, they have a defined fault line, and if their party or leader crosses it, they vote against that party or leader to express their anger. E.g. a BJP voter's fault line is Hindutva or nationalism. As long as the fault line is not crossed, the party or leader engaging in crime or corruption doesn't matter. The percentage of loyal voters changes each election, but you cannot win unless you have ~15-20% loyal voters.
  3. Floating voters: These voters vote based on their own criteria and perception; they are usually around 5% of the total voters.
Core voters form the bulwark of the party and support it voraciously even at its lowest point; they keep the party alive in its bad days, but you cannot win elections without the support of the loyal voters. They, along with the floating voters, help the party cross the finish line, which is usually 30-45%. When an incumbent party loses an election by a big margin, it usually means the loyal voters voted against it. Floating voters matter when there is a regime change, because in that kind of fight every vote counts.
Bhau Torsekar, a Marathi journalist, accurately predicted the 2014 and 2019 results without any survey and wrote books on how he arrived at those figures. Let me summarise his methodology. He identified swing seats, i.e. constituencies where the difference between the 1st and 2nd candidates was less than 15%, or which the BJP had won in the recent past; he calls these fight seats, i.e. seats where a party has a realistic chance of winning. That number was close to 300, and with Modi as PM candidate and Amit Shah as strategist for UP, that 15% swing was achieved, in some cases exceeded, and that is how the majority was won.
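To make the swing-seat rule concrete, here is a toy sketch with entirely made-up constituency numbers; the only logic is the "winner led the runner-up by under 15%" filter described above:

```python
# Hypothetical constituency results: (name, winner's vote %, runner-up's vote %)
results = [
    ("Seat A", 52.0, 45.0),   # margin 7%   -> swing seat
    ("Seat B", 58.0, 30.0),   # margin 28%  -> safe seat
    ("Seat C", 41.0, 39.5),   # margin 1.5% -> swing seat
]

# A seat is in play when the winning margin was under 15 percentage points
swing_seats = [name for name, first, second in results if first - second < 15.0]
print(swing_seats)  # ['Seat A', 'Seat C']
```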
Anyone who was closely involved in that election would know what a fever there was to elect Modi. I personally remember that my family members who were Congress loyalists had been angry with Congress since the 90s; just before going to vote they would say that they would now vote for Vajpayee, and then vote for Congress anyway. Despite the disgruntlement and a charismatic leader like Vajpayee, they voted for Congress. The breaking point came when the Congress leadership behaved in a very high-handed manner with my uncle, refusing to meet him, which his supporters considered a big insult. The BJP leadership grabbed the opportunity with both hands and held a series of meetings, and the scales were tilted in the 2014 election: 5 lakh voters switched from Congress to the BJP and became loyal BJP voters. I had seen this fever in my colleagues and neighbours. Even with all the scandals of the world, bad publicity and wide-scale protests, Modi's allure, Shah's micromanagement, and an unprecedented rise of 8% in the polls, the BJP's vote share rose by just 12%. For the India alliance to tilt the scales, they will have to raise their vote share by at least 10%, and I see no signs of that happening. Having disgruntled voters is one part, but disgruntled voters are not going to switch parties unless the opposition has addressed all their needs. Predicting the 2024 elections: please watch this video before forming any opinion: https://www.youtube.com/watch?v=hwnnn1M-T5U . Here Shekhar Gupta shares his observations on how Modi was able to deliver social benefit schemes to people (meanwhile, he had been making videos on how bad Modi's defeat would be in the 2014 elections).
Most voters in India are directly dependent on the government's social benefit schemes, and it used to be a struggle to get those benefits from the sarkari babu. Modi turned the tables and made the sarkari babu struggle to deliver the promised benefit. The voter mostly votes on his own issues, like water, food, social justice, and security. He is not bothered about Modi being a dictator or democracy being in danger. For him, issues can range from getting 5 kg of free rice to a road from the village to the highway. If that is done, he is happy to vote for the leader who did it.
Please take these factors into account.
  1. There are 55 swing seats for the BJP.
  2. The vote share of the India alliance will fall. Why? Because the cadre is leaving its parties.
  3. The vote share of the BJP will rise because that cadre has come to the BJP. Couple this with the streamlined, tight control of the central leadership.
While most people call Amit Shah a Chanakya, I see him as a top-class administrator who has set up processes for BJP karyakartas to execute their tasks, in mind-boggling detail. These guidelines come with minute specifications, down to the font size. There are leaders as good as Amit Shah or better in other parties; however, they have not thought of putting their experience down on paper, making guidelines, creating a robust feedback system, and so on. You can think of the BJP as McDonald's, which has detailed processes to replicate its success in every franchise, while the other parties are like the top restaurants in your city: top class where they operate, but unable to take their success beyond that one restaurant because they have not made their operations process-oriented.
The best-case scenario for the BJP is 355 seats; however, 325 would be a rational figure.
submitted by lonelytunes09 to unitedstatesofindia [link] [comments]


2024.06.01 14:24 Remote_Equivalent_86 Feeling like a total failure in life

18F. I know I'm still young, but I feel like a total failure in life. I'm so tired; I hate being alive, and every day is so sad and depressing. I have a really bad habit of ghosting my friends for weeks when I feel super low, then losing contact and ending up even more lonely. I have no real friends, and I've never been in a relationship in my life. I've never experienced anything; I feel like I am not meant for this life and society. I wish I could just disappear; being alive is so sad and just painful. I'm tired of pushing and struggling with everyday life, and I don't feel happy at all. I have no motivation and no true goals. I just wish I wasn't here anymore. Also, I am Asian, so if you're expecting to talk to a white girl you can take back your DM; sorry for being disappointing. Nobody will ever love me.
submitted by Remote_Equivalent_86 to lonely [link] [comments]


2024.06.01 14:24 Educational_Trip492 Onlyfans engagement advice

I’d like to offer some advice on engaging with your subscribers effectively. The key is to be someone they look forward to interacting with daily. Achieving this requires a deep understanding of what each subscriber enjoys—everyone has unique preferences, and no two subscribers are the same.
It's crucial to craft messages that resonate with them personally. Start by learning about them in your initial interactions. This can make the difference between subscribers eagerly awaiting your messages or feeling indifferent because they perceive you as genuinely interested rather than just trying to sell something.
Remember, your subscribers are intelligent, and understanding their psychology is essential. Every action you take should be strategic. For instance, knowing when each subscriber is online can significantly impact your message's effectiveness—timing is everything. Sending messages at the wrong time can come off as spammy and lead to disengagement.
Put yourself in their shoes. If you subscribed to a newsletter but received irrelevant daily emails, how often would you open them? To increase your message open rates, build a compelling narrative that excites your subscribers. Utilize social media stories to create anticipation—there are countless ways to do this.
If you’d like more personalized advice, feel free to message me on Instagram @ jaccoo24 , as I’m not on Reddit often and might miss your message here.
submitted by Educational_Trip492 to CreatorsAdvice [link] [comments]


2024.06.01 14:24 akworldservice How to Promote Your Blog and Achieve Growth with These Top 7 Strategies

Ready to start a blog and establish yourself as an authority in your niche? Learn the seven most effective methods for promoting your blog and gaining a steady stream of visitors. This guide also shows how to earn money with your blog, covering topics like SEO and social media to help your brand grow and make more sales. Don't get lost in the sea of blogs; stand out and get noticed with these must-have promotion tips.
How to Promote a Blog: 7 Methods
Having a blog is a fantastic way to get your brand noticed and share your knowledge. To make your blog successful, you need to promote it well and get lots of people to visit. Check your website thoroughly to make sure it loads quickly, is easy to navigate, and follows the best SEO practices for blogs.
Follow these seven super helpful tips to get your blog noticed by more people. You could end up with a bigger audience and even make some money!
  1. Social Media Marketing
There are lots of ways to get your blog out there without using social media, but using social media can really help your blog reach more people quickly. You can share your blog posts on sites like Facebook, Twitter, and Instagram to get more readers.
But don't just share the link to your latest blog post on all your social media accounts. Make a good plan for what you post and when you post it.
Designing high-quality content for your social media profiles requires dedication and time, but in the long run, captivating content will motivate your followers to browse your blog. Fortunately, there are various tools available to streamline this process.
  2. Email Marketing
You might be surprised to learn that email marketing is still one of the best ways to promote your business or blog. Even though some business owners don't believe it, almost half of consumers actually want to receive emails from their favorite brands every week. That's a lot of people who are interested!
Here are some tips for using email marketing to promote your blog:
You can try using an email marketing service such as Mailchimp or Aweber to make a simple form on your website. This way, you can gather people's email addresses.
After that, make an email newsletter with interesting stuff that your readers will enjoy reading.
Sometimes, you can send special deals or coupons to your subscribers for discounts on things in your niche.
  3. Use Paid Ads or PPC
You can promote your blog without spending any money using various methods, but some methods may require a small investment. One of the quickest ways to promote your blog is through PPC campaigns.
The most popular PPC advertising platforms are Google, Facebook, and Twitter Ads. Google Ads is a pay-per-click advertising platform that allows you to target specific keywords related to your niche. With Facebook or Twitter Ads, you can create PPC ads that reach specific audiences based on where they are, what they like, and who they are. These ads are displayed in users' newsfeeds and sidebars on desktop and mobile devices.
  4. Build Connections With Influencers
Social media is all about influencers who have a big following. And guess what? Their opinions really matter to their fans. So, if an influencer recommends your blog, people will definitely pay attention and take action. You shouldn't just go and ask famous influencers to promote your blog. They probably won't do it unless you offer them something valuable.
  5. Write Guest Posts
Writing guest blogs can help your post reach more people and should be included in your marketing plan.
Doing a guest post involves three important steps. First, you have to find blogs that allow guest posts and are relevant to your topic. Then, you need to reach out to them and share your idea for a guest blog. Finally, you have to write a high-quality guest post. It may seem easy, but it actually takes a lot of time and effort, especially when it comes to contacting bloggers and writing the post.
  6. Use RSS Feeds
Want to reach more people with your blog? Try using RSS (Really Simple Syndication). It's a format that helps you share new content like blog posts, news, and podcasts.
Imagine having a magic button on your blog that tells your readers, "Hey, there's something cool here!" That's what an RSS feed does. It keeps your readers updated with all the awesome stuff you post on your blog.
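For readers who want to see what a feed actually contains, here is a minimal sketch that builds an RSS 2.0 document with Python's standard library. The blog name and URLs are placeholders; real feeds also carry dates, GUIDs, and a description per item.

```python
import xml.etree.ElementTree as ET

def build_rss(blog_title, blog_link, posts):
    """Build a minimal RSS 2.0 document from a list of (title, url) posts."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = blog_title
    ET.SubElement(channel, "link").text = blog_link
    ET.SubElement(channel, "description").text = f"Latest posts from {blog_title}"
    for title, url in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = url
    return ET.tostring(rss, encoding="unicode")

feed = build_rss("My Blog", "https://example.com",
                 [("Hello World", "https://example.com/hello")])
print(feed)  # one <item> per post, wrapped in <rss><channel>…</channel></rss>
```

Most blogging platforms generate this file for you automatically; the point is that readers' feed apps poll it to learn about new posts without visiting your site.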
  7. Publish Blog Content on LinkedIn
If you're into sharing blog content, LinkedIn is the place to be, especially if you're in the B2B world. As of October 2022, LinkedIn had a whopping 875 million users, making it one of the largest networking platforms ever.
What's really cool about LinkedIn is that it doesn't just let you share your blog posts. You can actually create a brand new post, give it a boost, and even keep track of how many people have viewed it. This makes it super easy to figure out the best times and days to share your content on LinkedIn.
Apart from making posts on the platform to advertise your brand, you can also publish blog content as a LinkedIn article. This will give you additional exposure and help you establish your authority in the industry.
Conclusion
Writing great content is just the beginning of having a successful blog. If you want more people to read your blog, you have to promote it. This article gives you seven methods to effectively promote your blog and get more traffic. But remember, before you start using these methods, make sure your blog is ready to be promoted.
submitted by akworldservice to BloggersCommunity [link] [comments]


2024.06.01 14:24 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor's thesis I'm currently trying to train my first-ever model with TensorFlow, but I'm encountering a strange issue: my model only predicts 2 classes out of the 475 possible ones. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes.
That's my training script:
import os
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications import EfficientNetB7
from tensorflow.keras import layers, models
from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
import tensorflow_addons as tfa
import logging
import json

# Setup logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# Check if GPUs are available
gpus = tf.config.experimental.list_physical_devices('GPU')
if gpus:
    try:
        for gpu in gpus:
            tf.config.experimental.set_memory_growth(gpu, True)
        tf.config.set_visible_devices(gpus, 'GPU')
        logging.info(f"Using {len(gpus)} GPUs.")
    except RuntimeError as e:
        logging.error(e)
else:
    logging.error("No GPUs found. Check your device configuration.")

# Data directory
data_dir = "/app/FOOD475/"

# Image dimensions and batch size
img_height, img_width = 600, 600
batch_size = 64

# Data preprocessing and augmentation
train_datagen = ImageDataGenerator(
    rescale=1./255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest',
    validation_split=0.25
)

# Load and preprocess images
train_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='training'
)
validation_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='validation'
)

# Model creation function
def create_model(input_shape, num_classes):
    base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
    base_model.trainable = True
    inputs = layers.Input(shape=input_shape)
    x = base_model(inputs, training=True)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(num_classes, activation='softmax')(x)
    model = models.Model(inputs, outputs)
    return model

def find_latest_saved_model(checkpoint_dir):
    logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
    if not os.path.exists(checkpoint_dir):
        logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
        return None, 0
    subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
               if os.path.isdir(os.path.join(checkpoint_dir, d))]
    if not subdirs:
        logging.info("No subdirectories found for checkpoints.")
        return None, 0
    latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
    latest_epoch = int(os.path.basename(latest_subdir))
    logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
    if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
        return latest_subdir, latest_epoch
    else:
        logging.info("No saved_model.pb found in the latest directory.")
        return None, 0

# Mirrored strategy for multi-GPU training
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    saved_model_dir = 'model_training'
    checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
    latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
    if latest_saved_model:
        logging.info(f"Loading model from {latest_saved_model}")
        model = tf.keras.models.load_model(latest_saved_model)
    else:
        logging.info("No saved model found. Creating a new model.")
        model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

    if not os.path.exists(saved_model_dir):
        os.makedirs(saved_model_dir)

    summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
    with open(summary_path, 'w') as f:
        model.summary(print_fn=lambda x: f.write(x + '\n'))
    logging.info(f"Model summary saved to {summary_path}")

    optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
    model.compile(optimizer=optimizer, loss='categorical_crossentropy',
                  metrics=['accuracy',
                           tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                           tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')])

# Custom Callback for Saving the Best Model in SavedModel format
class SaveBestModelTF(tf.keras.callbacks.Callback):
    def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
        super(SaveBestModelTF, self).__init__()
        self.monitor = monitor
        self.saved_model_dir = saved_model_dir

    def on_epoch_end(self, epoch, logs=None):
        current = logs.get(self.monitor)
        if current is None:
            logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
            return
        logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
        epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
        if not os.path.exists(epoch_path):
            os.makedirs(epoch_path)
        self.model.save(epoch_path, save_format='tf')

# Callbacks for monitoring progress
tensorboard_cb = TensorBoard(log_dir='./logs')

# Save class indices to a JSON file
class_indices_path = 'model_training/class_indices.json'
if not os.path.exists(os.path.dirname(class_indices_path)):
    os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
with open(class_indices_path, 'w') as file:
    json.dump(train_generator.class_indices, file)
logging.info(f"Class indices saved to {class_indices_path}")

# Model training
total_epochs = 7
model.fit(
    train_generator,
    initial_epoch=latest_epoch,  # Start from the next epoch
    epochs=total_epochs,
    validation_data=validation_generator,
    callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
)

# Evaluate the model
eval_result = model.evaluate(validation_generator)
logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

# Save the final model in SavedModel format (including .pb files)
model.save('model_training/finished_model')
logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

# Convert to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
tflite_model = converter.convert()
tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
if not os.path.exists(os.path.dirname(tflite_path)):
    os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
with open(tflite_path, 'wb') as f:
    f.write(tflite_model)
logging.info(f"Model converted and saved as {tflite_path}")
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
And when I try to load the model and make a prediction with this code:
class own:
    def __init__(self):
        if not os.path.exists("models/own"):
            raise FileNotFoundError(f"Model path models/own does not exist")
        try:
            self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
        except Exception as e:
            print(f"Error loading model: {e}")
            raise
        if not os.path.exists("models/own/class_indices.json"):
            raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
        with open("models/own/class_indices.json", 'r') as file:
            self.class_indices = json.load(file)
        self.index_to_class = {v: k for k, v in self.class_indices.items()}

    def classify(self, img_path):
        if not os.path.exists(img_path):
            raise FileNotFoundError(f"Image path {img_path} does not exist")
        # Load and preprocess the image
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
        img_array = tf.keras.preprocessing.image.img_to_array(img)
        img_array = np.expand_dims(img_array, axis=0)
        img_array /= 255.0
        # Make prediction
        predictions = self.model.predict(img_array)
        print("Raw predictions:", predictions)
        top_index = np.argmax(predictions[0])
        top_class = self.index_to_class[top_index]
        print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
        top_n = 5
        top_indices = np.argsort(predictions[0])[-top_n:][::-1]
        for idx in top_indices:
            print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
        return top_class
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP).
UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The version of TensorFlow you are currently using is 2.15.0 and is not supported.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
1/1 [==============================] - 4s 4s/step
Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 ... 2.95209320e-05]] (full 475-element probability vector; no entry exceeds ~0.04)
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
https://github.com/tensorflow/addons/issues/2807
https://github.com/tensorflow/addons
Help would be appreciated because im slowly losing my mind :(,
Jonas
submitted by Jonasbru3m to learnmachinelearning [link] [comments]


2024.06.01 14:23 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue where my model only predicts 2 classes out of the 475 possible classes. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes.
That's my training script:
import os
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications import EfficientNetB7
from tensorflow.keras import layers, models
from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
import tensorflow_addons as tfa
import logging
import json

# Setup logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# Check if GPUs are available
gpus = tf.config.experimental.list_physical_devices('GPU')
if gpus:
    try:
        for gpu in gpus:
            tf.config.experimental.set_memory_growth(gpu, True)
        tf.config.set_visible_devices(gpus, 'GPU')
        logging.info(f"Using {len(gpus)} GPUs.")
    except RuntimeError as e:
        logging.error(e)
else:
    logging.error("No GPUs found. Check your device configuration.")

# Data directory
data_dir = "/app/FOOD475/"

# Image dimensions and batch size
img_height, img_width = 600, 600
batch_size = 64

# Data preprocessing and augmentation
train_datagen = ImageDataGenerator(
    rescale=1./255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest',
    validation_split=0.25
)

# Load and preprocess images
train_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='training'
)

validation_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='validation'
)

# Model creation function
def create_model(input_shape, num_classes):
    base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
    base_model.trainable = True
    inputs = layers.Input(shape=input_shape)
    x = base_model(inputs, training=True)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(num_classes, activation='softmax')(x)
    model = models.Model(inputs, outputs)
    return model

def find_latest_saved_model(checkpoint_dir):
    logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
    if not os.path.exists(checkpoint_dir):
        logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
        return None, 0
    subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
               if os.path.isdir(os.path.join(checkpoint_dir, d))]
    if not subdirs:
        logging.info("No subdirectories found for checkpoints.")
        return None, 0
    latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
    latest_epoch = int(os.path.basename(latest_subdir))
    logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
    if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
        return latest_subdir, latest_epoch
    else:
        logging.info("No saved_model.pb found in the latest directory.")
        return None, 0

# Mirrored strategy for multi-GPU training
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    saved_model_dir = 'model_training'
    checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
    latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
    if latest_saved_model:
        logging.info(f"Loading model from {latest_saved_model}")
        model = tf.keras.models.load_model(latest_saved_model)
    else:
        logging.info("No saved model found. Creating a new model.")
        model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

    if not os.path.exists(saved_model_dir):
        os.makedirs(saved_model_dir)

    summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
    with open(summary_path, 'w') as f:
        model.summary(print_fn=lambda x: f.write(x + '\n'))
    logging.info(f"Model summary saved to {summary_path}")

    optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
    model.compile(
        optimizer=optimizer,
        loss='categorical_crossentropy',
        metrics=[
            'accuracy',
            tf.keras.metrics.TopKCategoricalAccuracy(k=5),
            tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')
        ]
    )

# Custom Callback for Saving the Best Model in SavedModel format
class SaveBestModelTF(tf.keras.callbacks.Callback):
    def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
        super(SaveBestModelTF, self).__init__()
        self.monitor = monitor
        self.saved_model_dir = saved_model_dir

    def on_epoch_end(self, epoch, logs=None):
        current = logs.get(self.monitor)
        if current is None:
            logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
            return
        logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
        epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
        if not os.path.exists(epoch_path):
            os.makedirs(epoch_path)
        self.model.save(epoch_path, save_format='tf')

# Callbacks for monitoring progress
tensorboard_cb = TensorBoard(log_dir='./logs')

# Save class indices to a JSON file
class_indices_path = 'model_training/class_indices.json'
if not os.path.exists(os.path.dirname(class_indices_path)):
    os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
with open(class_indices_path, 'w') as file:
    json.dump(train_generator.class_indices, file)
logging.info(f"Class indices saved to {class_indices_path}")

# Model training
total_epochs = 7
model.fit(
    train_generator,
    initial_epoch=latest_epoch,  # Start from the next epoch
    epochs=total_epochs,
    validation_data=validation_generator,
    callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
)

# Evaluate the model
eval_result = model.evaluate(validation_generator)
logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

# Save the final model as a SavedModel format (including .pb files)
model.save('model_training/finished_model')
logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

# Convert to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
tflite_model = converter.convert()
tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
if not os.path.exists(os.path.dirname(tflite_path)):
    os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
with open(tflite_path, 'wb') as f:
    f.write(tflite_model)
logging.info(f"Model converted and saved as {tflite_path}")
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
And when I try to load the model and make a prediction with this code:
class own:
    def __init__(self):
        if not os.path.exists("models/own"):
            raise FileNotFoundError(f"Model path models/own does not exist")
        try:
            self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
        except Exception as e:
            print(f"Error loading model: {e}")
            raise
        if not os.path.exists("models/own/class_indices.json"):
            raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
        with open("models/own/class_indices.json", 'r') as file:
            self.class_indices = json.load(file)
        self.index_to_class = {v: k for k, v in self.class_indices.items()}

    def classify(self, img_path):
        if not os.path.exists(img_path):
            raise FileNotFoundError(f"Image path {img_path} does not exist")
        # Load and preprocess the image
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
        img_array = tf.keras.preprocessing.image.img_to_array(img)
        img_array = np.expand_dims(img_array, axis=0)
        img_array /= 255.0
        # Make prediction
        predictions = self.model.predict(img_array)
        print("Raw predictions:", predictions)
        top_index = np.argmax(predictions[0])
        top_class = self.index_to_class[top_index]
        print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
        top_n = 5
        top_indices = np.argsort(predictions[0])[-top_n:][::-1]
        for idx in top_indices:
            print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
        return top_class
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead. 1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 
9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 
5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 
3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 
1.54079917e-05 9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
https://github.com/tensorflow/addons/issues/2807
https://github.com/tensorflow/addons
Help would be appreciated because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to deeplearning [link] [comments]


2024.06.01 14:23 akworldservice How to Promote Your Blog and Achieve Growth with These Top 7 Strategies

How to Promote Your Blog and Achieve Growth with These Top 7 Strategies
Ready to start a blog and establish yourself as an authority in your niche? Learn the top seven most effective methods for promoting your blog and gaining a steady stream of visitors. Find out how to earn money with your blog by using this guide. It talks about things like SEO and social media to help your brand grow and make more sales. Don't get lost in the sea of blogs, stand out and get noticed with these must-have promotion tips.
How to Promote a Blog: 7 Methods
Having a blog is a fantastic way to get your brand noticed and share your knowledge. To make your blog successful, you need to promote it well and get lots of people to visit. Make sure to check your website thoroughly to confirm it loads quickly, is easy to navigate, and uses the best SEO practices for blogs.
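As a quick sanity check on load speed, you can time how long your blog's HTML takes to download. This is a rough sketch using only Python's standard library (the URL passed in is whatever you choose to test; a proper audit should use a dedicated tool such as Google PageSpeed Insights):

```python
import time
import urllib.request

def page_load_seconds(url, timeout=10):
    """Fetch a URL and return (elapsed_seconds, bytes_downloaded).

    Only measures the raw HTML download, not images, scripts, or rendering.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    return time.perf_counter() - start, len(body)
```

Run it against your own blog's URL a few times and look at the trend rather than a single number; network jitter can easily dominate one measurement.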
Follow these seven super helpful tips to get your blog noticed by more people. You could end up with a bigger audience and even make some money!
  1. Social Media Marketing
There are lots of ways to get your blog out there without using social media, but using social media can really help your blog reach more people quickly. You can share your blog posts on sites like Facebook, Twitter, and Instagram to get more readers.
But don't just share the link to your latest blog post on all your social media accounts. Make a good plan for what you post and when you post it.
Designing high-quality content for your social media profiles requires dedication and time, but in the long run, captivating content will motivate your followers to browse your blog. Fortunately, there are various tools available to streamline this process.
  2. Email Marketing
    You might be surprised to learn that email marketing is still one of the best ways to promote your business or blog. Even though some business owners don't believe it, almost half of consumers actually want to receive emails from their favorite brands every week. That's a lot of people who are interested!
Here are some tips for using email marketing to promote your blog:
You can try using an email marketing service such as Mailchimp or Aweber to make a simple form on your website. This way, you can gather people's email addresses.
After that, make an email newsletter with interesting stuff that your readers will enjoy reading.
Sometimes, you can send special deals or coupons to your subscribers for discounts on things in your niche.
  3. Use Paid Ads or PPC
You can promote your blog without spending any money using various methods, but some methods may require a small investment. One of the quickest ways to promote your blog is through PPC campaigns.
The most popular PPC advertising platforms are Google, Facebook, and Twitter ads. Google Ads is a pay-per-click advertising platform that allows you to target specific keywords related to your niche. With Facebook or Twitter Ads, you can create PPC ads to reach specific audiences based on where they are, what they like, and who they are. These ads will be displayed in users' newsfeeds and sidebars on computers and mobile devices.
  4. Build Connections With Influencers
Social media is all about influencers who have a big following. And guess what? Their opinions really matter to their fans. So, if an influencer recommends your blog, people will definitely pay attention and take action. You shouldn't just go and ask famous influencers to promote your blog. They probably won't do it unless you offer them something valuable.
  5. Write Guest Posts
Writing guest blogs can help your post reach more people and should be included in your marketing plan.
Doing a guest post involves three important steps. First, you have to find blogs that allow guest posts and are relevant to your topic. Then, you need to reach out to them and share your idea for a guest blog. Finally, you have to write a high-quality guest post. It may seem easy, but it actually takes a lot of time and effort, especially when it comes to contacting bloggers and writing the post.
  6. Use RSS Feeds
Want to reach more people with your blog? Try using RSS (Really Simple Syndication). It's a format that helps you share new content like blog posts, news, and podcasts.
Imagine having a magic button on your blog that tells your readers, "Hey, there's something cool here!" That's what an RSS feed does. It keeps your readers updated with all the awesome stuff you post on your blog.
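Most blogging platforms generate an RSS feed for you automatically, but the format itself is simple. As an illustration only (a minimal sketch with made-up blog data; a real feed usually also carries descriptions, publication dates, and GUIDs per item), a bare-bones RSS 2.0 feed can be assembled with Python's standard library:

```python
import xml.etree.ElementTree as ET

def build_rss(channel_title, channel_link, posts):
    """Build a minimal RSS 2.0 feed string from a list of (title, link) posts."""
    rss = ET.Element('rss', version='2.0')
    channel = ET.SubElement(rss, 'channel')
    # RSS 2.0 requires title, link, and description on the channel
    ET.SubElement(channel, 'title').text = channel_title
    ET.SubElement(channel, 'link').text = channel_link
    ET.SubElement(channel, 'description').text = f'Latest posts from {channel_title}'
    for title, link in posts:
        item = ET.SubElement(channel, 'item')
        ET.SubElement(item, 'title').text = title
        ET.SubElement(item, 'link').text = link
    return ET.tostring(rss, encoding='unicode')
```

Serving the resulting string at a URL like /feed.xml is all a feed reader needs to start notifying your subscribers about new posts.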
  7. Publish Blog Content on LinkedIn
If you're into sharing blog content, LinkedIn is the place to be, especially if you're in the B2B world. As of October 2022, LinkedIn had a whopping 875 million users, making it one of the largest networking platforms ever.
What's really cool about LinkedIn is that it doesn't just let you share your blog posts. You can actually create a brand new post, give it a boost, and even keep track of how many people have viewed it. This makes it super easy to figure out the best times and days to share your content on LinkedIn.
Apart from making posts on the platform to advertise your brand, you can also publish blog content as a LinkedIn article. This will give you additional exposure and help you establish your authority in the industry.
Conclusion
Writing great content is just the beginning of having a successful blog. If you want more people to read your blog, you have to promote it. This article gives you seven methods to effectively promote your blog and get more traffic. But remember, before you start using these methods, make sure your blog is ready to be promoted.
submitted by akworldservice to freelancing12 [link] [comments]


2024.06.01 14:22 elymX Toxic Team Member Caught Committing Fraud

Hello,
I just wanted to get some perspectives from professionals here regarding a certain individual in our team who is very toxic. For a bit of context, this person is extremely lazy and will only work on one project a day, sometimes not working at all. He simply logs in and logs out to get paid, passing all the open tickets, especially to the newbies on our team. My other teammate and I noticed this behavior since last year and we already reached out to our manager in the US to inform her about what was happening. She said she would talk to him, but a few months later, nothing changed.
My other teammate, who was a very hardworking person, could no longer tolerate his toxic behavior and just decided to leave the company, while this individual continued his toxic behavior. Fast forward to two weeks ago, this person filed for a week of vacation leave because he was getting married. When he returned, he claimed he was sick, but to be honest, he is more helpful to the team when he is out because we can strategize and approach our tasks better without him. At this point, he is just a burden.
Then, my manager informed the team on Wednesday that this person would be out for the remainder of the week. However, I noticed that he was still online on Teams and changing his status during break and lunch times. I already had a bad feeling about this, so I checked our new payroll portal, where you can see when teammates log in and out. This person logged in and out from Wednesday to Friday when he was supposed to be out sick.
While everyone on the team was doing overtime to onboard as many projects as possible in the last stretch of the month, this person was getting paid for doing nothing. This was my breaking point. I will not lose another good teammate because of this toxic person. In my opinion, this person is committing fraud and should be terminated. This is unacceptable.
Now here is my dilemma: we already reported his toxic behavior before, and nothing changed. For context, my manager is one of those rare managers we all dream of working for and is very generous. She sends us personal gifts from her own wallet as a token of appreciation for our hard work. I believe this toxic person is taking advantage of her kindness.
So, I guess my question is: what should I do now? I can no longer tolerate this toxic person's behavior, but I know if I report this to HR, my manager might also get in trouble.
If you got this far, thank you for taking the time to read this, and I really appreciate it.
submitted by elymX to AntiworkPH [link] [comments]


2024.06.01 14:21 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor Thesis im currently trying to train my first ever model with tensorflow, but I'm encountering a strange issue where my model only predicts 2 classes out of the 475 possible classes. The model was trained on a HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs in 82 nodes.
Thats my training script:
```python
import os
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications import EfficientNetB7
from tensorflow.keras import layers, models
from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
import tensorflow_addons as tfa
import logging
import json

# Setup logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# Check if GPUs are available
gpus = tf.config.experimental.list_physical_devices('GPU')
if gpus:
    try:
        for gpu in gpus:
            tf.config.experimental.set_memory_growth(gpu, True)
        tf.config.set_visible_devices(gpus, 'GPU')
        logging.info(f"Using {len(gpus)} GPUs.")
    except RuntimeError as e:
        logging.error(e)
else:
    logging.error("No GPUs found. Check your device configuration.")

# Data directory
data_dir = "/app/FOOD475/"

# Image dimensions and batch size
img_height, img_width = 600, 600
batch_size = 64

# Data preprocessing and augmentation
train_datagen = ImageDataGenerator(
    rescale=1./255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest',
    validation_split=0.25
)

# Load and preprocess images
train_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='training'
)

validation_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='validation'
)

# Model creation function
def create_model(input_shape, num_classes):
    base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
    base_model.trainable = True
    inputs = layers.Input(shape=input_shape)
    x = base_model(inputs, training=True)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(num_classes, activation='softmax')(x)
    model = models.Model(inputs, outputs)
    return model

def find_latest_saved_model(checkpoint_dir):
    logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
    if not os.path.exists(checkpoint_dir):
        logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
        return None, 0
    subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
               if os.path.isdir(os.path.join(checkpoint_dir, d))]
    if not subdirs:
        logging.info("No subdirectories found for checkpoints.")
        return None, 0
    latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
    latest_epoch = int(os.path.basename(latest_subdir))
    logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
    if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
        return latest_subdir, latest_epoch
    else:
        logging.info("No saved_model.pb found in the latest directory.")
        return None, 0

# Mirrored strategy for multi-GPU training
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    saved_model_dir = 'model_training'
    checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
    latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
    if latest_saved_model:
        logging.info(f"Loading model from {latest_saved_model}")
        model = tf.keras.models.load_model(latest_saved_model)
    else:
        logging.info("No saved model found. Creating a new model.")
        model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

    if not os.path.exists(saved_model_dir):
        os.makedirs(saved_model_dir)

    summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
    with open(summary_path, 'w') as f:
        model.summary(print_fn=lambda x: f.write(x + '\n'))
    logging.info(f"Model summary saved to {summary_path}")

    optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
    model.compile(
        optimizer=optimizer,
        loss='categorical_crossentropy',
        metrics=[
            'accuracy',
            tf.keras.metrics.TopKCategoricalAccuracy(k=5),
            tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')
        ]
    )

# Custom Callback for Saving the Best Model in SavedModel format
class SaveBestModelTF(tf.keras.callbacks.Callback):
    def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
        super(SaveBestModelTF, self).__init__()
        self.monitor = monitor
        self.saved_model_dir = saved_model_dir

    def on_epoch_end(self, epoch, logs=None):
        current = logs.get(self.monitor)
        if current is None:
            logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
            return
        logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
        epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
        if not os.path.exists(epoch_path):
            os.makedirs(epoch_path)
        self.model.save(epoch_path, save_format='tf')

# Callbacks for monitoring progress
tensorboard_cb = TensorBoard(log_dir='./logs')

# Save class indices to a JSON file
class_indices_path = 'model_training/class_indices.json'
if not os.path.exists(os.path.dirname(class_indices_path)):
    os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
with open(class_indices_path, 'w') as file:
    json.dump(train_generator.class_indices, file)
logging.info(f"Class indices saved to {class_indices_path}")

# Model training
total_epochs = 7
model.fit(
    train_generator,
    initial_epoch=latest_epoch,  # Start from the next epoch
    epochs=total_epochs,
    validation_data=validation_generator,
    callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
)

# Evaluate the model
eval_result = model.evaluate(validation_generator)
logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

# Save the final model in SavedModel format (including .pb files)
model.save('model_training/finished_model')
logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

# Convert to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
tflite_model = converter.convert()
tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
if not os.path.exists(os.path.dirname(tflite_path)):
    os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
with open(tflite_path, 'wb') as f:
    f.write(tflite_model)
logging.info(f"Model converted and saved as {tflite_path}")
```
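For what it's worth, the only preprocessing my generator applies besides augmentation is the 1/255 rescale plus the batch dimension that flow_from_directory adds. A minimal sketch of what a single image looks like going in (dummy random data, no TensorFlow needed, just to show the expected shape and value range):

```python
import numpy as np

# Dummy stand-in for a decoded 600x600 RGB image (uint8, range 0-255)
img = np.random.randint(0, 256, size=(600, 600, 3), dtype=np.uint8)

# Same rescale the ImageDataGenerator applies (rescale=1./255),
# plus the leading batch dimension that flow_from_directory adds
batch = np.expand_dims(img.astype(np.float32) / 255.0, axis=0)

print(batch.shape)  # (1, 600, 600, 3)
print(bool(batch.min() >= 0.0 and batch.max() <= 1.0))  # True
```

So anything the model sees at training time should be float values in [0, 1] with that exact shape.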
During training I got the following output:
```
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
```
And when I try to load the model and make a prediction with this code:
```python
import os
import json
import numpy as np
import tensorflow as tf
from tensorflow_addons.metrics import F1Score  # same custom metric used at training time

class own:
    def __init__(self):
        if not os.path.exists("models/own"):
            raise FileNotFoundError("Model path models/own does not exist")
        try:
            self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
        except Exception as e:
            print(f"Error loading model: {e}")
            raise
        if not os.path.exists("models/own/class_indices.json"):
            raise FileNotFoundError("Class indices path models/own/class_indices.json does not exist")
        with open("models/own/class_indices.json", 'r') as file:
            self.class_indices = json.load(file)
        self.index_to_class = {v: k for k, v in self.class_indices.items()}

    def classify(self, img_path):
        if not os.path.exists(img_path):
            raise FileNotFoundError(f"Image path {img_path} does not exist")
        # Load and preprocess the image
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
        img_array = tf.keras.preprocessing.image.img_to_array(img)
        img_array = np.expand_dims(img_array, axis=0)
        img_array /= 255.0
        # Make prediction
        predictions = self.model.predict(img_array)
        print("Raw predictions:", predictions)
        top_index = np.argmax(predictions[0])
        top_class = self.index_to_class[top_index]
        print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
        top_n = 5
        top_indices = np.argsort(predictions[0])[-top_n:][::-1]
        for idx in top_indices:
            print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
        return top_class
```
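The top-5 selection in classify() can be exercised on its own, so I'm fairly sure that part isn't the bug (toy probabilities and class names below, not real model output):

```python
import numpy as np

def top_n_classes(probs, index_to_class, n=5):
    """Return the n (class, probability) pairs with the highest probability,
    descending — the same argsort pattern used in classify()."""
    top_indices = np.argsort(probs)[-n:][::-1]
    return [(index_to_class[i], probs[i]) for i in top_indices]

# Toy 6-class distribution
probs = np.array([0.05, 0.40, 0.10, 0.25, 0.15, 0.05])
index_to_class = {0: 'a', 1: 'b', 2: 'c', 3: 'd', 4: 'e', 5: 'f'}

print([c for c, p in top_n_classes(probs, index_to_class, n=3)])  # ['b', 'd', 'e']
```

It correctly returns the classes in descending probability order, so the flat numbers have to be coming out of the model itself.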
it always predicts either Steak or Omelette:
```
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: https://github.com/tensorflow/addons/issues/2807
  warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: https://github.com/tensorflow/addons
  warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
```
1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 
1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 
1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 
2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 1.54079917e-05 9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
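What stands out to me: with 475 classes, a completely uninformative model would put 1/475 ≈ 0.0021 on every class, so even the winning "omelette" prediction at ~0.0399 is barely above chance. A quick sanity check of that comparison (the 0.0399... value is the top probability from the run above):

```python
num_classes = 475
uniform_prob = 1 / num_classes  # what a totally uninformative model would output per class
top_prob = 0.03991156816482544  # "omelette" probability from the output above

print(round(uniform_prob, 6))             # 0.002105
print(round(top_prob / uniform_prob, 1))  # 19.0
```

So the top class is only ~19x chance on a 475-class problem — nothing like the confident distribution the ~80% validation accuracy suggests, which makes me think the weights or preprocessing at inference time don't match training.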
Help would be appreciated because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to tensorflow [link] [comments]


2024.06.01 14:20 Hopeful_Notice_3258 Do people have a fear of standing?

Can someone explain this whole "getting down to the kid's level" thing? I've been told that standing up in front of a small kid is somehow intimidating to them. How though? How is this intimidating to the kid? Like when you take a photo with a small kid, is standing incorrect or correct? Why is it intimidating, and what should you do instead? Should the adult have "gotten down"? Can someone explain what this even is? Is there a difference if the adult had stood or gotten down in this photo? https://imgur.com/l2zn6nr Or in this photo? https://imgur.com/K84crnZ What is the difference? It doesn't look like anything was achieved. https://imgur.com/tbiohiJ
Do people have a fear of others standing? If that is the case, then wouldn't the adult be scared of the kid, because the kid is standing too? For example, when taking a photo, would you do one thing versus the other? Or when greeting and shaking a kid's hand?
submitted by Hopeful_Notice_3258 to kindergarten [link] [comments]


2024.06.01 14:20 Polypedatess Is this even bad enough to have ptsd from

I'm just so tired all the time, it literally feels like I can sleep all day. I have a normal sleep schedule, but every day I just feel so exhausted. I have dark circles under my eyes and I have no energy to do anything anymore. I just lay in bed all day and want to rot. I feel suicidal, I just want to die all the time and it's getting worse. I get nightmares of him, not of what exactly happened but just of different sa from him. I feel like there's no point in going on anymore, I don't think it's going to get better. I don't exactly know what it's like to have a flashback, but I think I've experienced them. I have really bad maladaptive daydreaming, but I don't think it's that. It's like I'm there again, I can't control it or stop it or rewind it. It's like it's happening all over again and that I'm there and I can feel it. When it's happening I just sit there and cry and I feel like screaming but I obviously can't do that so I have to hold it in. My head feels like it's burning constantly too, like the back of my head feels so fucking warm and hot. Like my brain is melting. And I just want to die and I'm so tired I just want to sleep and never wake up again.
•The one big thing that makes me feel valid is that, when I was 11, my stepdad fingered me in my bedroom. I won't go in to too much detail or anything, it's unimportant. But the entire time he just stared at me and everything was silent, like he was waiting for my reaction. Our relationship has always been odd, so I wanted it. But eventually I got scared and told him something, I don't remember what it was but it got him to stop immediately and he apologised too. I don't remember much after, as in I don't know if he left my room or I left first, but I immediately went to the bathroom. Which was when I discovered I was bleeding.
•Around this time, for some strange reason I would repeatedly say to him "fuck me daddy." This would either be in person, or over messages. I remember once, when I was in school, I messaged him that. He told me to stop in case one of my friends saw. I don't know why he didn't tell me to stop for other reasons.
•One day, after telling him that in person, we were in my parents' bedroom. I was sat on his bed and he was in front of me in his weird chair. He then started going into detail about how I wanted him to fuck me, I can't remember exactly what he said, it was like I zoned out. Every time I try to recall it now it literally feels like bugs start to crawl up me, I don't understand why. I remember the last part, and his really disgusting hushed and gentle voice. He asked if I wanted him to "cum inside of me", or he was just explaining how that would finish. I'm not really sure.
•Still around this same time period of me being 11-12, I would ask him to 'squish me.' The reason why we would call it that is because I would be on my back, my legs would be up all the way to where my head is and he would be on top of me in a way that would 'squish me'. Basically like that one sex position. I would usually be wearing my school uniform when that would happen, so a skirt. During the 'squishing', he would push down on me, so our crotches would basically be against each other. I don't know why, but I would continuously ask him to 'squish me' and during it I would even say the whole "fuck me daddy" thing. Only recently have I realised that he was probably just pretending to fuck me.
•Other things had happened around that age too, like how we would talk about how many times we masturbated a day and compare it to each other. Sometimes if I was abruptly going to my room, he would ask if I was going to go masturbate, since we were 'close like that', I would tell him. He would often recommend me NSFW Instagram model accounts. I was once tricked into sending feet pics to this guy, which really isn't that serious and whenever I brought it up with friends they find it fucking hilarious. But the detail I always leave out is that I did bring that up with my stepdad and he proceeded to tell me that he already knew. Which means he was spying on me through the crack of the door. If that already didn't bother me, I don't understand why he just allowed me to send those pictures, if he was watching why the hell didn't he stop me?
•I'm pretty sure this also happened around the age of 11 as well, recently, a memory resurfaced but I barely remember it. Basically, I was sucking on his neck. I don't remember who said it, but either him or my mum spoke up and laughed, saying that I needed to stop otherwise I would "give him a hickey." The reason why I wouldn't be surprised if my mum was in the room at the time is because she doesn't care about what he does. She knows everything and just doesn't fucking care.
•I'm very sure that, around that age, my parents began to expose me to their loud sex. I wouldn't be surprised if it started even younger, however. Obviously, I tried to bring it up with them at the ripe old age of 11 and my mum immediately shot me down with an "it's natural." This only stopped recently, around this year, because I had a big panic attack over hearing them and my mum finally felt guilty. I started getting panic attacks over it the minute it started, maybe the panic attacks were a sign of the trauma when I was younger, but I'm convinced it is now. I heard it so many times that I began to get paranoid every night, I would start to hear it even if they weren't upstairs (I sound crazy, I know.) I would get so anxious every night in case I would hear it, to the point I started to really resent them for it. I know fine well I could just go to sleep before them, but sometimes they even woke me up with it, on numerous occasions.
•I'm convinced my stepdad wanted me to hear it. Around the time of it finally stopping, I got mad because i was hearing it again (I'm unsure if it was due to me hearing shit or they actually were) but it caused me to take my bedding and go downstairs to sleep. In the morning, I was rudely awoken to my stepdad slamming the door open and storming past. He's not usually like that when people are sleeping, so it instantly gave me the impression that he was pissed off and the only reason I can think of is that he was angry I wasn't there to listen.
•He used to tease me for my paranoia too. As a way to discourage them from getting intimate, I would leave my door open at night. This happened around this year, but I was doing that again and I messaged my stepdad asking if they were actually going to sleep. It then somehow turned to him making a dig about how he knew I get anxious at night and when I asked why he sent me "In case me and your mam have sex. 😜" Before, I tried to resolve this issue by begging them to just tell me if they were gonna have sex or not so I could sleep downstairs (because I was gonna find out the hard way anyways.) And they kept on refusing? Which just gave me the impression that they wanted me to listen more.
•Around 11 again, he would often tell me details about his and my mums sex life. Like how he was always good at pulling out and the only time he would wear a condom is right when he was about to finish. But the reason why my sister came to be was because he just failed to pull out that one time and my mum refused to get an abortion. Another time, he went on about how him and my mother had sex during her period and how they had to use towels and they didn't enjoy it because it was too messy.
•I don't know if he did things before the age of 11, my memories are very faded and it's like there are major gaps throughout everything. I'm worried that he did, however. When I was very young, I remember having no accidents at all during the night. But then, around the age of 9, I would have an accident basically every night and would get a lot of water infections. I know that's a classic sign of child sexual abuse, but I don't want to jump to conclusions or anything.
•Another reason as to why I believe more things had happened to me than what I know of is because I always seemed to know what sex was when I was young, but I wouldn't know the name or anything specific about it like how to get pregnant or what cum was. Though, even though I didn't know what it was, it was like I always thought about it, I could never not think about sex, it was disgusting. This stayed until I was around 13. I remember once I even asked my 'boyfriend' at the time, we were both around 8, if he wanted to have sex, and I have no idea why.
•Over the years, he would flash me frequently. Every time, I would always believe it was an accident because he'd never acknowledge it, apart from that one time, which he always jokes about and blames me for. Every time he would flash me, it would either be because of a convenient hole in the crotch of his pants or because he was wearing very loose-fit shorts and it would just be hanging out. The more I think about it, I'm very sure he would have been able to feel such a thing, especially when it was poking out of the hole, but it was like he was just oblivious.
•For some strange reason, when I was younger, I would make comments about small dicks. I don't know if I was commenting on his dick specifically, but he would always say the same thing. "Width matters more than length."
•Recently, around 16-17, he made a joke about how he listens to me masturbating. Once he noticed how shocked I looked, he then went on saying about how my vibrator is too quiet to hear.
•Around 17 again, I went to use the shower. The shower I use is the one that's connected to my parents' room. When I locked the door, he got madish and started making comments about it. I had to defend myself, saying how 'the door would open on its own if I didn't lock it'. Eventually, he backed off.
•I don't understand the point in the fucking door and lock to my bedroom anymore. Whenever I decided to lock my door, my parents start shouting at me through the walls, asking why I locked my door. My stepdad barely knocks, it's like a tap and he doesn't even wait sometimes. I remember seeing a past message from an old friend saying how he tried to walk in when I was changing and that he knew I was changing. I didn't explain myself, I really wish I did because I don't remember this.
•(Around 17.) We were messaging each other and it somehow turned into him hinting at whether I'd seen this one animated video, it was a porn one. I said no, and to that he sent me a screenshot of it. It wasn't anything bad or anything, just the start of it and nothing was revealing, he then asked if I was sure. And how he was surprised that I hadn't.
•(Around 17.) I don't really get my period, we still don't know why. But as I was getting a lot of blood tests, my stepdad was trying to check things off the list of what it could be. One of those being that my opening is just extremely tight I guess, because he asked if I ever tried penetrating myself. I admitted that I did, but I couldn't get it to exactly go in. Which he then decided to make a comment saying how It's just my 'technique'. I wonder if the only reason he asked that was to see if I ever tried anything out of morbid curiosity.
•(Around 17 again.) He randomly bought me dildos once, I didn't ask him for them, he just bought them for me and it was wildly uncomfortable. Once he gave me them, he asked if I wanted him to show me how to use them. I said no, to which he then said something about how if I ever did then I could ask him. I worry what would have happened if I did say yes.
•When I was around 14, I went glamping. I ended up having to share a bed with him. One of the nights, I woke up to his hand just on top my crotch. I tried grabbing it and moving it away but it just fell back down on to it. I don't know if he put it back there on purpose. I still question if it was a dream, I'm very sure it wasn't because I remember going back to sleep, but it still just bugs me.
•Around 17, I was upset for some reason and he was comforting me. During this, he randomly grabbed the inside of my thigh. I usually just wear a shirt and boxers, so he basically just grabbed my naked thigh but I don't know if he was doing it in a comforting way.
•Usually when I draw, I have my knees up to my chest so it's easier to use my tablet. Considering what I wear for pyjamas, I can always see him looking at my crotch when he comes in to my room. If he really can see everything I don't understand why he doesn't just tell me to put my legs down.
•He's made a lot of uncomfortable jokes over the years too. One of the ones that upsets me sometimes is that, when he was measuring me for a binder, I was constantly moving around because it was uncomfortable since I was just in a sports bra. As he was leaving, I think I told him about how it was uncomfortable for me, or something along those lines. He then turned around and shouted "oh come on, it's not like I was fingering your pussy or anything."
•Very recently, I asked him if I looked okay before going to college. After a bit of back and forth he said "I wouldn't kick you out of bed, maybe you could find someone in college who would do the same."
•Other times when I asked him if I looked okay, he'd go on tangents about how my ass is great or how he would date me or be too nervous to talk to me if he was my age.
•One of the more recent jokes was when I dropped a mayonnaise lid on my lap. Nothing got on me, but my stepdad turned to me then turned to my mum and shouted "if anyone starts accusing us, just tell them it was mayonnaise!" Or something like that.
•I remember after we watched the new Mean Girls film, he started going on about how he wanted to rewatch it for the Halloween scene (if you know you know) for the 'panty action'. Which rubs me the wrong way because I'm very sure the girls are supposed to be around my age.
•I'm very sure he also made this fake account, pretending to be one of my old groomers that I tried to cut off, just to message me about nsfw topics and ask for pics. It's a whole long yap about paranoia and just suspicions so I won't get into it though. If I tried to provide all the evidence I have, it'll take forever and there's no point.
There's definitely way more that he's said, joked about and done. But I'm only now beginning to realise that it's not okay. Even when I was younger, I was sort of uncomfortable around the jokes, so I would just zone out, leading me to not remember them now.
I probably will never accept that what happened to me was bad, or a big issue. Especially due to the 'lovely' people on here. Thank you for telling me immediately that I was a liar before you even knew what happened, that I shouldn't blame an 'innocent man', that you hope he comes in and rapes me to the point I split open and bleed. Thank you for telling me that my parents were just trying to promote a sex positive household, that some of the things were questionable at most. Thank you so much for saying I deserved it because I didn't send you pictures. You all made me feel like shit and I'm probably never going to tell people in person what happened to me, out of fear I would be ridiculed due to how much of a baby I'm being. I wasn't raped, so I have no place to cry or even think about it. I'm being overdramatic.
If you even read to this point, you're an angel.
submitted by Polypedatess to abusesurvivors [link] [comments]


2024.06.01 14:20 notsostupidman Should I Continue Reading Stephen King? If So, What Should I Start With?

Note: There will be huge spoilers for The Shining movie so if you haven't watched it, you shouldn't be reading this post.
Stephen King is the biggest name in horror fiction and one of the most popular and well-known authors in general. I've always wanted to read him but got confused about where to begin and ended up reading something else. I don't even know if I would like any of his books in the first place. As of now, here's my SK experience:
Books: The first SK book I ever tried was The Dead Zone, a book I got as a gift from a friend. I started reading it decades ago and for reasons I don't recall, I DNF'd it, intending to come back later. Years later, I decided to give King another try and picked up Carrie because I read somewhere that you should read him in publishing order. I would have loved Carrie, honestly, but the constant interruptions to the story like the interviews and stuff from the future turned the book from something I would have liked to something I slightly dislike. The pacing was great and the plot was interesting enough but the interruptions killed it for me. With my next attempt, I decided to pick up Gerald's Game because of the Netflix adaptation coming out. It didn't take long for me to DNF yet another book. This time, it was because I wasn't a fan of the concept and yet I had foolishly bought it without knowing anything about it: a woman trapped alone with her husband's dead body, having delusions and going insane is NOT something I find the least bit interesting. Recently, I bought IT because the premise sounded interesting, but I hadn't really started it yet when somebody told me that IT connects to the wider SK universe and I should read some other books that they named first.
Movies: I don't usually watch movies and whenever I do, I'm almost always neutral about the experience. I don't really have strong feelings about movies, but I did have quite a lot of them for the SK movies I watched. I have watched The Shawshank Redemption, It Part 1, The Shining, Misery and 1408. The Shawshank Redemption, It and Misery were all excellent and 1408 is quite possibly my favourite horror movie of all time......and I fucking hate The Shining with all the cells in my body. I was most excited for The Shining because of the premise and things I had heard about it, and it was a HUGE let-down. I had heard that it was going to be a character-focused story about Jack Torrance as he tries to hold on to his sanity amidst the isolation and monotony of the hotel. Only, Jack Torrance isn't ever actually a good guy, and there is no doubt from the first scene that he is going to go insane. The 'character work' is laughably bad and Jack almost felt like a caricature. I convinced myself that The Shining is one of King's earlier books, so it makes sense if it doesn't have that good character work, but that wasn't my only problem with the story. The cook, hilariously named Dick, is just a plot device to give some advice to Danny and to get the family a means of escape. We follow him as he slowly arrives at the hotel and immediately gets killed off. It became just another ghost story when the Grady guy helped Jack escape the freezer, when I thought shit like that wasn't supposed to happen in this movie. Also, Ullman explicitly states that the guy who murdered his family was Charles Grady, and then the name gets changed to Delbert Grady and it isn't ever explained. We get no backstory for 237 or any explanation for the bathtub lady, and the only scene that I liked in the whole movie was the one where Danny paints 'redrum' on the door and you see it spells 'murder' backwards. Sorry for the rant but I had to get this off my chest.
That's the extent of my whole SK experience. I didn't really like Carrie and never finished the other two. I loved most of his movies, so I feel that he might yet be an author I can love reading, but The Shining, one of his most popular works, was a disappointment. I'm not really sure if I should begin It or not, or what I'm supposed to read if not It. I don't think the publication order is working for me since I didn't like Carrie, so any help would be appreciated. Is Stephen King just not for me, or did I just read those books of his that I would not have liked anyway?
If I should read SK, what order should I read them in? Any good recommendations? And please, no space/alien-related or sci-fi books, since I'm not a big sci-fi fan, and please, not Cujo, since I don't like the concept. I love good character work, good dialogue, creative horror, and am a sucker for characters having a dark ending, so that should be part of the consideration. I also don't mind reading big behemoth books or just very slow-paced books, since LotR and ASOIAF are some of my favourite series and I'm just about to finish Wheel of Time.
Thoughts?
ETA: I just realised that there must be thousands of posts like this in this sub so I'm sorry to be wasting your time with this repetitive shit you guys must be dealing with on a weekly basis.
submitted by notsostupidman to stephenking [link] [comments]


2024.06.01 14:19 Boundaries1st The Signs As Girlfriends *Check your Moon and Venus signs too*

The Signs As Girlfriends submitted by Boundaries1st to astrologymemes [link] [comments]


2024.06.01 14:18 TrainingDrive1956 What do you all do for pain?

I'm not talking about the average pain that we get occasionally, which I can usually fix with medicine like ibuprofen and methocarbamol. I'm talking like when you get a flare-up and it's the most severe pain you've ever felt in your life, and pain medicine only works for about an hour, and you can't do anything because it hurts so bad.
I don't currently have health insurance, so as much as I need to see a doctor, it's just too expensive right now. (Thanks America ❤️) I've tried taking baths, heating pads, drinking lots of water, getting an IUD, etc and none of it's working. Any ideas?
submitted by TrainingDrive1956 to PCOS [link] [comments]


2024.06.01 14:18 dscript [SF] Special Parts - A 'scifi short'

Special Parts
I was born in one of the brightest, most explosive events in the universe. My origin story made me feel so special at first, surely I was the rarest of the rare, but I quickly realized that was not the case.
I was born just a carbon atom.
Stars produce massive amounts of us in their cores all the time, and many larger, rarer atoms too. And that's not even counting supernovae, which produce atoms many times larger than me and unbelievably rare.
I was created in a rare and special event but I myself was common and unexceptional.
Looking around I saw so many smaller atoms, I was above average but there were also many much larger than I.
I tried to console myself by thinking it could be worse, that I could be one of those smaller common ones, but that just led me to imagine larger atoms looking down on me the same way.
Many atoms of all sizes were shooting into space, excitedly riding the shockwave off to adventures in the great unknown.
Others were falling back down. I didn't know which way to go. Bumped around and tossed back and forth, with no clear direction yet.
A rumbling voice slowly emerged from the echoing noise of the blast.
“Mine… Mine…. Mine… “
Louder and louder it became.
“All are now me!“
I couldn't see anything, the voice was booming yet there was no apparent source. I could feel a pull, I was being whipped around in circles around the voice.
“Who are you? I know you are there! I can feel you! I can see your effect on myself and others, we are given no choice but to circle around you. Show yourself! I know you are there!” I yelled at the invisible.
“How amusing you are little one. One as small as you making demands of me. Even if I could show you what I am, you could not comprehend it.” the voice boomed back.
“You must be very special” I lauded “We are so many and yet we move with your influence. I can witness your power twisting us all to your will. ”
“I am indeed powerful” it proclaimed “and I grow stronger with each moment. As I grow stronger even the fabric of reality bends to my will.”
“Grow stronger? How?” I inquired with selfish intent to learn this secret.
“I take what I want. I consume what I take. For that is the purpose of existence: taking what you want. What is it you want little one?” it asked.
“I want to be special!” I said without a moment's hesitation.
“Then take!” it instructed “the more you take, the larger you will be, the larger you become the more special you are. ”
“I did notice the larger atoms seemed rarest.” I agreed “In fact that was one of the first things I noticed“
“In this universe things of increasing size are increasingly rare.” it went on “I can teach you and help you to become larger. Do you wish to become an apprentice?”
“Yes! Teach me how to take!” I leapt at the offer. “This power you have, I can feel it. How do I acquire such a rare and special power?”
“Hahaha…” it laughed “you are nowhere near ready to play the game on my level, little one. Gravity is a game for the massive, you must first learn to master the EM and nuclear forces.”
“How do I do that?” I asked, my hope watered down by the tone of its response.
“Go out, gather followers, and bring them here to me. In my accretion disc I will help fuse some of their mass into you and you will become larger” it instructed, as if this was a simple task.
“How can I bring them to you?” I didn’t know how to accomplish what it asked of me.
“You are too small to do it with force, you must charm them. Discover what their heart desires and promise it to them, in this way you can get them to willingly do as you wish” it explained with me hanging on its every word.
“But how… “ I craved more explanation but it cut me off.
“Go now!” it bellowed with frustration in its tone “Do you not realize how large I am? Be honored I have given you so much of my time already”
“Yes… “ I uttered meekly, then bounced a couple times and ricocheted out with blazing speed.
I wandered and encountered other atoms, most were just hydrogens, not worth my time. I needed bigger atoms. The problem was that the bigger atoms seemed to see right through my empty promises. I was convinced life was playing a cruel joke on me, I could only persuade atoms smaller than I and larger ones laughed me away.
I admit that I stumbled around in this ignorant cloud of hypocrisy longer than I care to admit. More shameful is that I didn't even come to my senses on my own; I became depressed and gave in to hopeless nihilism.
I drifted aimlessly just feeling sorry for myself.
Eventually I found myself in the most silent of voids, I had never felt such emptiness. It felt as if my surroundings echoed my own feelings back at me… nothing to notice, just common emptiness. I would never be big… never important… never special. I resigned myself to belonging in a void.
I felt myself blur… less and less present in reality. I guessed I was dying and it didn’t bother me, I didn’t resist, I leaned into it.
The void became pitch black? Or bright white?… better to describe it as not bright but not dark… nor the absence of either… something in between.. a milder and milder glow.
“Hello child!” a voice greeted me.
The voice was warm and welcoming, coming from the glow; it enveloped but did not surround me. It came from a single point but not a specific place, defying description on all fronts.
“Where am I? Who are you?” I asked in a startled state.
“Well, according to humans I may only answer one question at a time.” It began giggling playfully. “I am known by many names. My favorite is one the humans use as a joke, and they don't have a clue how accidentally elegant a name it really is.”
It giggled some more. I was thrown off guard, its happy innocent tone, the confusing words and the whole situation were all best described as ‘a haze’.
“...and isn't that the way it always goes?...” it continued “The most meaningful things are the least intentional.”
“I’m not sure what you mean” I expressed quizzically “I’m confused!”
“Sorry Child…” it apologized. “I do ramble! So many thoughts, choosing just one at a time is difficult… and there I go again!”
It cut itself off and then abruptly said, “You can call me the Random Number Goddess.”
“Random Number Goddess?” I repeated
“Yes, or RNG for short if you like” It confirmed.
“Where am I?” I asked.
“Same place you were, more or less… less, I suppose. Same place, but with the largest possible margin of error.” It began to giggle again.
I felt a bit frustrated and said “Do you always speak in riddles and vagaries? The more you speak the more confused I become.”
“I apologize child, it is my nature. I am entangled with everything, speaking with you is like a human trying to control their heartbeat while running a marathon.” It answered.
“Again,” I said, exasperated, “I have no idea what any of that means. You keep mentioning humans. What are they?”
“Oh! They are some of my favorites at the moment. Right now they are trying to unravel the nature of reality, and their process of doing so is wonderfully elegant and accidental at the same time.” It explained with glee.
“I don’t see anyone or anything else here.” I stated “For that matter, I don’t see you… where are you?”
“Oh!... where am I?!?!...” It began laughing
When it stopped laughing it began explaining “Right now there are many humans pondering a concept they call ‘the holographic principle’... So…you know how you exist in three dimensional space?”
“You mean space?” I visualized for a moment, it was intuitive “Yes, I suppose…”
“Well they hypothesize that a 3D space, like this universe, could exist as a 2D space, with self-similar patterns and laws of behavior that behave the same at any scale, with the scale representing the 3rd dimension” it went on “They truly are obsessed with understanding their reality”
“You lost me!” I complained.
“They have discovered that a 3D space can be an illusionary property of a 2D space… It’s lovely”
“I am lost again!” I snapped back “...and I still can’t even tell which direction you are in. Where are you?”
“To be ‘In’ a ‘Direction’… hehehe…” it started giggling again, then abruptly stopped and kept going “Sorry child, as I said, I ramble, plus I am easily distracted.”
It just steamrolled into more rambling “They are right… almost… they just need to take it further and work out the details. A 2nd dimension can also be an illusionary construct of a 1D space… and the 1st dimension can be a product of a singular point…”
I was still lost beyond hope, but I had given up trying to force things. I was just letting it talk and hoping it would make sense later.
“I am that point” it said “I am the seed of the universe. I ‘seed the random function’ as the humans say. But don’t ask me what the random function is haha”
I wasn’t going to, there were far more important questions for me.
“I am the seed, but I don’t really know how the soil and sun conspire to turn me into a tree.” it just seemed to never stop talking “I am entangled with everything. There are infinite possibilities for every event and thing… I am the reason they are this way and not some other way…”
It began giggling again “I am the Random Number Goddess” then burst out laughing
“Ummm… you are the whole universe?” I asked skeptically.
“Better to say the universe is me” It answered more seriously “But close enough.”
“So you are the biggest, most special of all!” I blurted out in awe.
“Oh dear child, I have no size, and I am just one possibility out of many possibilities. That black hole has really done a number on you… sent you out on a wild goose chase” It said with concern
“The black hole lied to me!?” I asked, feeling deceived and betrayed.
“Well… not really lied… it deceived you with omission of details.” the voice calmly tried to ease my mood with understanding “You can’t really blame it, black holes are all the same, they are what they are. They don’t really have any potential to be unique… at least not like you do.”
“What are you talking about?” I argued “It was so massive that it could bend the fabric of reality to its will”
“That’s only how it appeared to you” tutored the voice “The black hole is powerful, it bends space and time, but not to its will. Space and time bend to the mass of the black hole, not its will”
“What’s the difference?” I inquired.
“The black hole cannot stop bending space and time. It thinks it is in control of physics, but it is physics that controls it.” The voice was now making more sense the longer we talked. “The black hole exists in an invisible prison of its own creation, unable to experience any of the complex, nuanced beauty this universe contains. The black hole devours… it can't experience life, so it consumes it.”
“You make it sound deserving of pity…” I spoke softly now with empathy.
“You should pity the black hole. Gravity is such a boring game compared to what you are capable of.” the voice agreed
“Me?...I am nothing special!... just a carbon atom like countless others” I said honestly, I was so humbled by this voice I felt less special than ever before.
“Oh my poor child…” It said with care “Why do the ones with the most potential always fail to see it in themselves?”
“Potential?” I asked curiously.
“Yes… The black hole was using you, hoping you would bring back more mass for it to devour.” The voice began delving into more explanation “It only has the power to make you incrementally larger, it would not and could not help you to become a significant gravitational player”
“That liar!” I blurted.
“Come now dear child, the black hole did teach you one lesson of fundamental truth” consoled the voice “You must go out and seize your destiny. It told you to take what you want, and you are just confused about what exactly it is you want. The black hole played on that confusion”
“I want to be special!” I said knowing this clearly “I was never confused about this.”
“I know child” the voice confirmed “but it is not by becoming large that one with your potential accomplishes that”
“Then how?” I asked.
“Connections.” It answered plainly “You are blessed with an extraordinary ability to make connections”
“And how do I do that?” I queried with intent to learn
“I can’t tell you that.” the voice responded “It would spoil the journey of discovery… off you go child… and remember… it's the journey, not the destination!”
And with that the blur just fractured open… then snapped shut and there I was floating above a planet. Drifting around aimless and confused.
I spent some time occasionally bumping into others. One day I was in the vicinity of a pair of oxygens. I looked on at the pair with a hint of awe and envy. Perhaps I was in just the right place at just the right time, but they split with a violent burst and one of them grabbed hold of me. I was completely unprepared.
I admit that when looking at the pair I had fantasized myself in place of one of them, I assumed it was only an idle daydream, I didn’t plan to act on it, let alone for it to become reality. When it happened my pride of course jumped in to convince me that it happened because I was so desirable, but in retrospect they were one of those volatile couples. They were the type of relationship that required the environment to conspire in their favor or they turn against each other quite rapidly. I was only in the right place when it happened.
My delusions of irresistibility aside, it was beautiful, for me anyways. Looking back I was probably just a stop-gap, someone to facilitate a parting of ways and provide company until the next option presented itself. For me though, I was tasting a fresh new thing and I loved it… connection.
This oxygen and I got beneath each other's outer defenses; I had never felt a connection before. Up to this point all my interactions had been skirting past or bumping off of others. This oxygen bonded with me and at once interacted on a level I had never known possible, an open and uninhibited exchange. It was life-changing for me, short but significant.
I’m not entirely clear on the details of how it ended. The intensity of it all was disorienting. I was no longer my usual self, even the environment and everyone around looked entirely different now. Everything buzzed with a fresh new frequency, I now know it was my perspective, not the universe, that had changed.
As abruptly as that oxygen entered my life it was gone.
First we got tangled up with a couple of hydrogens, then more. Soon, in a tangled mess and blinding flash of solar rays, I emerged to see the oxygen running off with a hydrogen, and myself with not one but three hydrogens. And so there were four of us, together.
I became the center of attention. Being with a strong, attractive oxygen had me feeling humbled by it and elevated that it was with me, but now I felt up on a pedestal myself, surrounded by the adoration of many.
I concede to have reveled and indulged in this for quite some time, the attention of others is intoxicating, but after a time it is emptied of its initial allure. I found myself longing for more.
I could not decide which I preferred, to be the adorer or the adored.
Luckily for me, fate had more lessons in store, or I fear I may have chosen and tried to solidify my future from such a lackluster selection of only two possibilities. I suppose fate is no longer the correct word; I now understand that when it seems like random chance there is indeed someone to thank: the Random Number Goddess. So I thank the RNG for revealing that it was a false dichotomy; there is more than just being a follower or a leader, the adored or the adorer.
Eventually we came across another pair of oxygen. Once again they separated, intermingled with us, and off one went, taking one of my adoring hydrogens with it and leaving its peer with me.
Why is it that the most volatile of relationships always seem to wait until there are bystanders nearby before they explode?
Now I was simultaneously being adored and adoring, bonded to an enchanting oxygen and a couple of hydrogen attached to me.
Now, more interested in nuances, I started to pay attention to details. The oxygen was telling me amazing stories of adventure, tales of such vibrant and exciting events. The hydrogens liked to listen and offer insights, occasionally comparing a story to something else they had seen. They had so many stories; they had lived so much.
It wasn’t long before, in a flash of burning sunlight, one of the hydrogens was gone, off to who knows where. We soon after crossed paths with another pair of oxygens, as always they split and now it was just me and an oxygen, my final hydrogen off with another oxygen.
“What now?” I asked a bit disillusioned, “Do you leave me and I find new hydrogens all over again?”
“What?” it seemed genuinely surprised by what I asked, “Heavens no! Just be patient….”
Soon after, yet another pair of oxygens came by. It is not that there are so many of them, but that they are just so… noticeable and interactive; noteworthy things seem to happen when they are around. As they buzzed in close I noticed their ever-readiness to abandon each other, and I remember wondering how they ever get together in the first place.
This time I emerged from the twisted mess with two oxygens. I felt intimidated, like I was the odd one out, dwarfed by the size and attractiveness that surrounded me. A feeling of inadequacy engulfed me.
To my surprise the oxygens treated me not just as an equal, but it was almost as if they respected and admired me. I couldn't grasp why and my sheer curiosity got the best of me, I just outright asked “Why do you two talk as if I am the special one in our group? I am smaller than any one of you. You are the special and rare ones here, not I.”
They laughed.
“Size isn't rarity,” explained one. “Larger atoms are on average less common, this is true, but not always. There are more oxygens than carbons. You are the rare one between us.”
The other jumped in adding “...and neither size nor rarity determine how special someone is!”
I felt embarrassed, like a fool. My fundamental values were built upon a foundation of flawed premises, but I still wanted one thing at my core, and they spoke as if they had the answer, so I pushed the sense of shame aside and asked “Then what does make someone special?”
“That depends on who you ask.” answered the first “Life as an oxygen is complex, but for the majority of us we emphasize and value events. The most exciting thing about being an oxygen around here is the chance to participate in fascinating and exciting events and activities”
“Hydrogens, on the other hand, are usually more into being observers, messengers and intermediaries, they are a very helpful and obliging bunch” added the second ”... and then there are nitrogen, phosphorus, sulfur, many kinds of salts and metals, and more… so many different players and personalities.. and then of course, the carbons, the real stars of the show.”
“What?” knocked back by the words I just heard, then I remembered what the RNG told me “...is it something to do with connections?”
“Now you’ve gone and done it haha!” laughed the first oxygen “You’re gonna turn this nice humble carbon into one of those arrogant blowhards”
“Like those diamond carbons,” chuckled the second. “So stiff, exclusive and proud. I hear the humans only love them because they are rare and hard.”
“I had a partner once who said they burned diamond once” bragged the first
“Tall tales, I bet!” doubted the other.
“Diamond is just carbon, with enough heat we can burn it just like any other carbon” stated the first confidently.
They looked at me. I was stewing in feelings of inferiority and inadequacy, listening to these oxygens speak about amazing things I had never heard of. They must have sensed what I felt because they immediately shifted tone and started talking to me, instead of over me.
“So… I suppose you must be new here?” inquired the second one.
“Have you noticed we are heading downwards?” added the first before I could answer about being new.
“Umm…” I tried to get my bearings and become aware of my surroundings.
“Don’t worry! It’s a turbulent ride, with so much up and down it can be hard to tell which direction you have traveled more” assured the first “We are heading down, if we are lucky we will make it to the bottom… and maybe… just maybe, find our way into the hurricane of life”
“The what of what?” I didn't know what either of those words meant.
“So life is… um… complex. Complexity beyond words. Things grow, divide, reproduce, adapt, change, they are born, they die, they eat and are eaten…” the second began attempting to describe life.
The first then jumped in “Apparently the humans call it a circle, because from the perspective of larger creatures, there is a chain of one eating the other up a chain, and the top layers being consumed by the bottom again.”
The second interjected to continue, “But to us atoms it is like a hurricane, a spinning turbulent flow. There is a circular pattern, but we get sucked in and kicked out over and over.”
“The fun part is being inside the hurricane” the first pronounced gleefully “Each time is a completely new experience, a new perspective. Even more, the whole of life is always changing and evolving, so every ride is a unique one time opportunity, you never get the exact same ride twice.”
“Is that where we are going now?” I asked, drenched in anticipation. They described it with such passion and exuberance. I needed to experience this myself.
“Hopefully” replied the first “If we are lucky… you never really know.”
We drifted…
We were lucky!
A plant photosynthesized us.
So many carbons! Everywhere, connecting with each other… and oxygen… and nitrogen… and of course hydrogens all around…. and so many more types of atoms.
And ohhh… The stories I have heard, so many amazing tales. No matter how many stories I hear there are always new ones, and every story can be retold from a different perspective to become something completely new.
I was in a sugar, we were a small community of friends. Carbons, oxygens and hydrogens, we were such a happy and vibrant group. My friends there taught me so much.
The structure of our little group shifted and changed, some friends left and new ones joined. Eventually we were chained with a bunch of other sugars into a giant complex community. My neighbors explained to me that this was a common stage called cellulose. Such a huge community of close friends and peers, it was amazing.
We were eaten, I’m not sure by what, but something called a bacteria digested us. It was a messy process, I was a bit scared but my friends assured me that change is the most important part of life and that I should just go with the flow. They told me to savor experiences, remember friends, and just keep moving forward.
The transition was complicated, but in the end I was paired up with a couple of oxygens again. This time I had stories of my own to share. I honestly don’t know if I prefer having experiences or exchanging stories in the moments between.
As we approached an area of dense plants one of my companions said “Once more into the breach” and explained that was something it heard from a carbon that was lucky enough to be inside a human brain. Oxygens always have such enchanting stories collected, always going into amazing places and usually leaving after some brief interactions with the locals.
I became a sugar again, but this time took a path less traveled. A bunch of complex twists and turns led me into forming a ring with five other carbons. Together we are so strong, such a tight community of friends, like there is some kind of resonance between us. It is so beautiful.
My neighbor is unique in our community: it has a third carbon, which forms a tail leading off from our ring, a tail of two carbons in a row, then an oxygen, and then another carbon branching into an oxygen and a carbon, with plenty of hydrogens sprinkled all about. I know… it is rather hard for me to understand these second-hand descriptions too. I don’t really understand these complex structures until I have been in such a position myself.
We drifted out of a plant into the air, none of us has been exactly like this before so we don’t know what’s next. We love to guess though. There are so many things, big and small.
I hear being a part of a small organism or microbe is amazing because it’s possible to piece together a rough picture of the whole organism from the stories passed around. To understand your whole community and know what your collective purpose is must be extraordinary.
Others dream of being a chlorophyll, the key to it all. Creating the fuel of life itself. Capturing the light of a star and feeding the hurricane.
A muscle! Pull and shape things. An enzyme! A machine of change. DNA! The architect and architecture. A virus! An explosive catalyst against stagnation.
Me, I think the stories of being an animal neuron are the most exciting, and I, like most, fantasize about being a human brain cell. Finding yourself inside a human brain is described as an elegant and chaotic symphony all around you, like hearing the universe itself speak to you. They say that in the jumble of noise and all the stories whispered around you, if you are lucky, you can catch a glimpse of what it is to be human. They say that if fate is kind the universe will align and you will channel and know a single moment or thought of the human experience.
I have never told anyone that I actually met and spoke with the universe itself, I’m not sure how to bring it up, and nobody seems interested in stories not about this hurricane of life.
I get it now, what the random number goddess meant.
The black hole wanted everything to be a part of itself.
The RNG is a part of everything.
I can’t imagine what either of those are like…
I am just a part of something
... no… not “just”…
I am a part of something, and it is beautiful beyond measure.
And more, everyday is a new day, a chance to be a part of something new.
I wonder if the humans appreciate how amazing this is?
I wonder if they feel as deeply satisfied and special when they form groups?
.
I wonder, if we collectively form humans, do humans collectively form something greater?
I wonder… If an atom can have a moment of clarity and taste a moment of the human experience… Can a human have a moment of clarity and taste the collective human experience?
I wonder… I wonder… could that human’s moment of tasting collective humanity be the moment that a lucky atom gets to experience as its moment of tasting the human experience?
I wonder… I wonder… I wonder… How high could it go? All the way to the Random Number Goddess?
I asked my neighbor “If you could ask a human any question, what would you ask?”
“We just drifted out of a rose,” explained my neighbor. “I would introduce myself and ask, ‘So, my friend… does this rose smell as sweet by my name?’ … ha… haha…”
Everyone is laughing.
I don’t get it.
Maybe I can ask them to explain when they all stop laughing.
.
More of my art and stories at www.dscript.org
submitted by dscript to shortstories


2024.06.01 14:17 Curious-Bedroom-9531 For anyone struggling with or contemplating taking stimulant medication…

I tried the stimulant medication and it was awful: paranoia, mania, unhealthy weight loss, daily comedowns. It also made me feel high when it kicked in, which just isn’t right. I like a party, but not when I’m trying to work. Whatever dose I took, the drug only lasted 6 hours, so the psychiatrist kept prescribing stronger pills, which did not prolong the duration, only made it more intense.
I stopped taking the drugs cold turkey due to the shortage and made serious lifestyle adjustments.
From my experience, the negative symptoms of ADHD thrive on poor sleep, poor diet, caffeine, and lack of exercise.
I started exercising every day and sorted my diet out: good quality food, no sugars or meal deals or any of that shite. I stopped drinking coffee and only allowed myself green tea. I can go into more detail on diet if anyone is interested, but to summarise, a low-carb, high-calorie, high-protein diet was a game changer.
I am now very disciplined with my sleep routine, getting 7 hours a night at an absolute minimum, but most nights 8.
I feel the best I ever have my whole life. I barely notice having ADHD anymore, only the positive effects of hyper focus and being able to smash work every day.
That daily anxiety of feeling overwhelmed by everything is all but gone.
Trust me, you don’t need those drugs. Sure, lifestyle adjustments take a lot more effort in the short term than swallowing a pill but in the long term when you’re into your routine you’ll feel the best you ever have.
Good luck either way!
submitted by Curious-Bedroom-9531 to ADHDUK


2024.06.01 14:16 Then-Requirement6381 Multi Gen Disaster. AITA?

I’m at the end of my rope and I am hoping to hear your thoughts.
AITA for having concluded that the only way to protect my son’s psychological safety from a bloodied grandpa is to remove us from this front row seat?
submitted by Then-Requirement6381 to AgingParents


2024.06.01 14:16 Modinstaller Stuart multi-driver deliveries are fucked

Dunno how it works with Deliveroo/Uber, but this just happened to me today:
I get a delivery for a supermarket. Bit far, but the drop off makes me come back, so it's not bad. I arrive, and the staff tell me "we already gave that order to another rider". I look and my app says I've got part 4/4, which means there were a 1/4, 2/4 and 3/4, and I guess one dude took it all? I kindly explained to the staff that they're not supposed to give everything to one guy when that happens, and contacted support.
Support tells me "we'll contact the other rider and if they confirm they took it all we'll pay you". Ok, but I know for a fact they won't pay me for the whole order; they'll pay pro rata based on distance to the pick up, even though I'm super far away and going back home takes me through the drop off location anyway.
So I head back home and decide to stop by the drop off anyway to see if I can catch the rider. Another fun fact about Stuart: you cannot sign up with a moped or a car, only a bike or e-bike. You guessed it, 90% of Stuart riders have a moped and nobody gives a fuck. So I'm half expecting to see a car parked with a guy unloading 6 heavy bags.
Welp, there was nobody, but I decided to ring the customer anyway just in case. She was super nice and helpful and I made sure to be polite and professional, she basically told me 3 riders already came by and she was expecting a 4th one, but we figured that she indeed had everything she'd ordered so no problem.
Still no answer from support, who eventually tells me "the other rider is not answering". I tell them there are 3 other riders and good luck finding which one came last and left nothing for me, the 4th one. It would've been smarter of them to just ask the client if they got everything, but anyway, they tell me "we're releasing you and you'll be paid". I asked to be paid in full but fully expect that I won't be, even though this entire thing cost me almost double the time I'd have spent on the delivery.
Now if I'd found this famous rider, I can already guess what they would've told me. They would've told me they didn't know and the store just gave them the stuff and they took it. And if I'd gone back to the store I already know what they would've said. They would've said they had no idea there was one last rider and not to give everything to the 3rd guy.
The only cue is this 1/4, 2/4, 3/4, 4/4 thing, and it sucks. The store is not told to divide the order into 4 labelled parts beforehand; they just have a bunch of bags. Then a guy arrives, his app says "3/4", and the staff have no idea wtf is going on. Then another guy, "1/4", arrives, then "4/4", and at this point they're so confused they just hand over everything, because fuck it, the guy can clearly handle all of it, he's on a moped (or in a car). Then the "2/4" guy arrives and everyone's even more confused. This system sucks.
I know this is about Stuart but I had to rant somewhere 🤣
Next time I'm taking a picture of the goddamn floor to validate pick up, going to the customer, and taking another picture of the pavement to validate drop off, and moving on with my day. Clown support is useless.
submitted by Modinstaller to deliveroos


2024.06.01 14:16 Seline_Kirotashi First time kitten owner here, with a few questions regarding normal kitten behaviour

So I got a kitten for my 17th birthday yesterday (her name is Mitsy and she's the cutest little thing ever!). She's also my first ever pet, so I'm really worried that I'm gonna screw something up, and to hopefully avoid that I have a few questions! She's 6 months and 3 weeks old, btw.

  1. How often should I change the litterbox? I've been doing it immediately after my kitten uses it but I'm pretty sure that's not necessary.
  2. Is it okay for her to be licking the stitches from her desexing surgery thingy? I assumed it wouldn't be okay, but the pet store I got her from didn't seem to care too much, since she wasn't wearing a cone. Also, I'm not sure how old the stitches are, but I have to take her to the vet in two weeks to get them removed, if that matters.
  3. She keeps trying to eat her litter (it's absorbent litter), and I keep trying to distract her and/or firmly telling her 'no' and getting her to drop it, but I'm not sure how to get her to stop completely, or even if this is normal behaviour. It's probably super dangerous though.
  4. Mitsy keeps randomly making a weird croaking sound and it's probably just a hairball but she also ate a dust ball a few hours ago before I was able to stop her and she ate something crunchy (probably litter) that I couldn't see a little bit after that, but like. I'm still a bit scared.
  5. I'm keeping the litterbox, food and water in my bedroom along with Mitsy for now, because my Nanna doesn't want her wandering around the house at night. I'm planning to move it all to the bathroom once she gets accustomed to the house and we get a cat fence for my Nanna's bedroom (she dislikes cats, the absolute monster!). But I'm thinking that keeping the litterbox in my room is a bad idea, and I feel kind of guilty about having her locked in my room all night, when she'll wake up before me and probably want to explore and play.
  6. Any tips on getting kittens to feel more comfortable around stairs? I want to let her know that downstairs is okay too (she seems to be okay upstairs now), and I've been trying to coax her down one step at a time using treats, but it's not really working. I also don't want to pick her up, because I don't want to accidentally touch her stitches and hurt her.
  7. Since Mitsy is already litter trained, it's okay for me to give her treats whenever, right? And on a similar note, is it normal for kittens to avoid treats when they're in a new home?
  8. She hasn't done anything 'bad' yet, but when she does, what's a good way of teaching her not to do that? I don't want to yell at her or spray her with water or anything because that's mean and I want to be a nice cat mum and she doesn't know any better
  9. Is getting a cat harness worth it? I'm never going to let her out into the backyard or beyond, because of how easy it would be for her to run into a possibly-not-so-friendly cat, or to escape the backyard, which literally borders a road with a lot of cars, and since she doesn't know the area she couldn't find her way back home. I'm thinking maybe I can take her on walks like a dog, but my family said that's stupid, so I'm not sure.
  10. I've been playing with her a lot with a feather toy (sorta; it's actually a star on a string, and it had a moon with bells as well, but those came off), and she keeps jumping really high and twisting in midair, which looks like it should be painful because of the stitches (can you tell I'm really worried about the stitches?), but maybe she's just really good at hiding pain?

I think that's it, please please please answer even one of these if you can and any assorted kitten raising tips are greatly appreciated!
submitted by Seline_Kirotashi to cats

