Does labcorp use the egt test

Cool Guides

2014.03.20 17:46 dadschool Cool Guides

Picture-based reference guides for anything and everything. If it seems like something someone might print, physically post, and reference, then it is a good link for this sub. Remember: Infographics are learning tools, guides are reference tools. Sometimes it's grey.
[link]


2011.03.01 01:47 flipmosquad r/23andMe

Talk about your genes and their possible implications! Discord: https://discord.gg/3Jjc3GdmtB
[link]


2010.06.17 06:10 tokepuppet r/saplings: a place to learn about cannabis use and culture

saplings: a place to learn about cannabis use and culture
[link]


2024.06.01 14:56 Square_Camera6532 Does the Xiaomi Pad 6 support 67W charging?

Does anyone know if the Xiaomi Pad 6 supports 67W charging? I recently bought one and wanted to know if anyone has tested it, or if you use a charger that supports 67W charging.
submitted by Square_Camera6532 to XiaomiPad6 [link] [comments]


2024.06.01 14:50 Alone_Arachnid_7216 Advanced NIPT testing

Advanced NIPT testing
So this is crazy! My birth center uses the Unity Screen for the NIPT, so not only does it test for genetic concerns, it can also tell you your baby’s Rh factor. I just got the email that my sample was received and was shocked that there is an add-on that can tell you your baby’s hair color, eye color, and taste preferences (for things like cilantro, etc.). Whattt?? Has anyone done this before?! It’s called the #BabyPeek test. I'm so fascinated by this and want to do it and see if it’s accurate!
submitted by Alone_Arachnid_7216 to BabyBumps [link] [comments]


2024.06.01 14:49 cheesehour G14 4080 vs 4090. I have both for 4 days

I need to pick one by Tuesday, but if anyone has a question comparing the two I can try to answer. I bought the 4090 for the 16GB vram for video game development, but Best Buy incorrectly had the 4090 listed with the IPS display (which I was excited for).
The 4080 has 16GB RAM, the IPS display (165 Hz, 500 nits), and the AniMe Matrix back
The 4090 has 32GB RAM and the miniLED display (165 Hz, 600 nits)
I play games, and the 4080 kills it. I can play Helldivers 2 at 4K with 80% of graphics options set to max and usually get 50 - 60 fps. Some dips to the 40s, but very rare, and usually due to environments, so during fights it's fine. I usually play on difficulty 7 or 8, getting 400 - 500 kills a match now.
I bought both open box, although I swear the 4080 was 100% new - everything was sealed.
Battery
The main thing I want is long battery life in Linux. If anyone has a battery life test, lmk. Using GHelper Silent + Eco + 60 Hz + 10% brightness + Windows battery saver on both (multi zone off on the 4090), the 4080's discharge seems to be around 6.2 watts with a low of 5.2, while the 4090's discharge is more like 6.8 with a low of 6.2. I might notice this, since most of what I do is edit text. I'm surprised the difference is so high. If you average 9 - 10 watts of battery use, it seems like the 4090 is using 0.5 - 1 watt more power -> 6 to 11% more power use -> assuming 8 hours of battery, you get 29 to 48 minutes less battery.
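Fwiw, that back-of-the-envelope math can be checked in a few lines (just a sketch; the 8-hour baseline and 9 - 10 W average are the figures above, and I'm assuming runtime scales inversely with average draw):

```python
def battery_minutes_lost(base_watts: float, extra_watts: float,
                         baseline_hours: float = 8.0) -> float:
    """Minutes of runtime lost when average draw rises from base_watts
    to base_watts + extra_watts, assuming runtime scales as 1/power."""
    baseline_min = baseline_hours * 60
    new_min = baseline_min * base_watts / (base_watts + extra_watts)
    return baseline_min - new_min

# Best case: 0.5 W extra on a 10 W average -> about 23 minutes lost.
# Worst case: 1 W extra on a 9 W average -> 48 minutes lost.
print(battery_minutes_lost(10.0, 0.5), battery_minutes_lost(9.0, 1.0))
```

Roughly in line with the 29 - 48 minute range above; the spread comes from where the extra watt lands within the 9 - 10 W average.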
OH! But the 4090 has a 2nd stick of RAM. I'll try another stick in the 4080 and see what happens, but for now I'll assume the miniLED eats less than 25 minutes of battery. Which is "fine", since I usually have a laptop-capable battery bank on me anyways.
I just enabled multi zone, and the discharge of the 4090 has been hovering at 6.2 - 6.5 watts (I'm not using it - just firefox, GHelper, and a settings menu are open). So disabling multi zone seems to have no effect. I'm typing this on the 4080, same things open, and the 4080 is at 6.1 watts.
After finishing writing this review: I've been writing this review and browsing the net a bit on the 4080, and I've only used the 4090 to toggle multizone and move the mouse some, and they are draining at the exact same rate - both are down exactly 20%. So fwiw, the 4080 under light use matches the 4090 at nearly full idle (screen on)
Display
The displays look very similar, honestly. I program in a font size too small for many people to read. I have great vision, and I don't like bright lights, so I thought I'd hate the miniLED blooming. The IPS display has better color accuracy, which is something I notice since I tend to run a solid grey desktop. A solid grey on the miniLED looks like you washed it with your reds - there are just other colors mixed into it (multi zone is disabled atm)
display test: https://www.eizo.be/monitor-test/
I haven't tried games yet, but I might return the miniLED simply on color accuracy. I've been doing more design-heavy web development, and matching colors is a pain; I don't need another variable in that equation. This is weird - on the display test's full-black screen, the mouse cursor becomes a shade of gray.
So - the miniLED gets much, much closer to black - but it's not able to compensate and make the white mouse cursor white. Very interesting - and kind of amusingly bad. So, you get "blacker blacks" by sacrificing your "whiter whites"? You're not achieving a higher contrast ratio, you're just inverting the problem. Shame I'm not a youtuber, so I can't break this news with a spicy title.
When miniLED multi zone is off, the mouse cursor displays fine (white is white, instead of grey). However, greys look much better when multi zone is on.
The miniLED gets a bit brighter, of course, and the colors pop a bit more. The miniLED gets a little dark towards the edges (not evenly lit); the IPS is very evenly lit. I've seen people complain about backlight bleed here, but my panel is near perfect.
The miniLED is less comfortable to look at. Like I said, I don't like bright lights, but even on low brightness the miniLED is less comfortable than the IPS. Maybe this is related to blooming - but tbh, I don't see much blooming - I feel like it's something to do with how the light is emitted. (Or maybe I'm tired and overthinking it.)
I think for pretty games or movies, many people might prefer the miniLED. But for everything else, the IPS is far superior. It's so funny to me now that everyone is complaining about not getting miniLED, when imo it's a bit of a failure that they passed off at a high price. Maybe later generations will be better.
What type of display is this miniLED? Is it also IPS?
I'm leaning heavily towards the 4080 for color accuracy... but I really wanted the 16GB VRAM and 5 more fps in Helldivers :( I'm joking about Helldivers - I wanted the VRAM for Unreal Engine, but I don't think I can work with these colors
Apologies for typos. I rewrote parts of this as I went, and I don't feel like spell-checking. And thanks u/ModrnJosh for all the research you've done and shared here.
submitted by cheesehour to ZephyrusG14 [link] [comments]


2024.06.01 14:48 Mroq93 Paingorger's Gauntlets synergy with Tyrael’s Might

Paingorger's Gauntlets synergy with Tyrael’s Might
I am trying to find a useful synergy with Tyrael’s Might. I am wondering if Paingorger's Gauntlets' passive can apply the mark from Tyrael’s Might's passive. I was testing it on the training dummy and did not notice any extra damage. Also, when I was running Pits, it does not look like it speeds up the run. The build I'm trying is Stormclaw Druid, and I am sweating af during level >50 Pits. As far as I know, the skill from Tyrael's Might does not have a Lucky Hit chance and does not scale from any bonuses. Is there any way to find a synergy with it?
submitted by Mroq93 to diablo4 [link] [comments]


2024.06.01 14:48 GunslingerHunter Multiple choice tests?

Does anyone else struggle massively with multiple choice questions and exams?
Every time I have to take one, I absolutely dread it. I was just looking at a professional certification I could achieve, and then discovered it's assessed via a multiple choice exam. When I've had to take these types of tests before, I have failed them in a big way - like scoring incredibly low or getting every single question wrong. Even for topics I'm an expert on and could tell you the correct answer (and more!) if you asked me verbally or got me to write it down, it all goes out the window as soon as you ask me in a tickbox format. It's been this way my entire life, and it doesn't matter how much I study or what methods I use.
Had a Google and found that people with dyslexia find them harder, so wondered if anyone with dyspraxia experiences something similar.
submitted by GunslingerHunter to dyspraxia [link] [comments]


2024.06.01 14:44 TriviaWithDad EP #55 (Five, Birds)

Score to Beat: This week is 8/10!
For those who are new, it's your knowledge vs. the combo team of a 10-year-old and a 13-year-old. Typically you crush them! The biweekly podcast is available anywhere and consists of all 4 rounds and a Two Truths and a Lie. Hope you enjoy.
Round 1: Five. An earthworm has 5 hearts, there are 5 vowels in the English language, and David had 5 pebbles when he defeated Goliath. And this round has 5 questions.
  1. Who is on the nickel?
  2. What is the fifth planet from the sun?
  3. What are the five senses?
  4. What does the five second rule commonly refer to?
  5. Name 3 of the 5 largest lakes in the world.
Answers: Five
  1. Thomas Jefferson, our 3rd president (our 5th president was James Monroe). Thomas Jefferson was one of our founding fathers and the principal author of the Declaration of Independence.
  2. Jupiter, the biggest planet in our solar system; it has around 80 known moons.
  3. Touch, Hearing, Sight, Taste, Smell. In typical human fashion, people are now arguing about adding senses (vestibular and proprioception). I got one for you: a sense of humor.
  4. Food is OK to eat if you pick it up in 5 seconds or less. Scientists did test this, and it's not recommended advice; bacteria can attach to food pretty much instantaneously.
  5. Caspian Sea, Lake Superior, Lake Victoria, Lake Huron, Lake Michigan. The Caspian Sea is so large it's called a sea, but it's a lake. It's about 5 times bigger than Lake Superior.
Round 2: Birds. Birds fall into many categories: raptors, wading birds, waterfowl, etc.
  1. Which bird has the largest wingspan in the world?
  2. What is the fastest bird in the world?
  3. What is the term used for the process of birds shedding old feathers and growing new ones?
  4. What is the only bird known to fly backwards?
  5. What is the name of the legendary bird that is said to burst into flames and be consumed by its own fire upon death, only to be born again from the ashes?
Answers: Birds
  1. Albatross, specifically the wandering albatross at 12.1 feet; pelicans and others are close at 12. The wandering albatross really wanders, spending most of its life in the air, even sleeping while gliding.
  2. Peregrine Falcon. When they go into their dive, they can reach 240 miles per hour! A living animal reaching 240 miles per hour! They have an extra eyelid to protect their eyes in these dives.
  3. Molting. Most birds molt once or twice a year. Apparently molting is very strenuous on birds so be nice to molting birds.
  4. Hummingbird. One of the most fascinating creatures in the animal kingdom, they weigh less than a nickel.
  5. Phoenix. The earliest stories of the phoenix come from ancient Egypt. The city of Phoenix was given its name by its founders because it had sprung from the remnants of the ancient Hohokam community.
submitted by TriviaWithDad to trivia [link] [comments]


2024.06.01 14:34 Sharon_Allen_ Is Advanced Leather Repair Gel Safe?

Advanced leather repair gel is generally safe for use on leather surfaces, provided that it is applied correctly and according to the manufacturer's instructions. This type of product is designed to fill and repair cracks, scratches, and minor tears in leather, restoring its appearance and functionality. The ingredients used in these gels are typically formulated to bond well with leather without causing damage or discoloration.
When using advanced leather repair gel, it's important to test the product on a small, inconspicuous area first to ensure it does not react adversely with your specific type of leather. Following the instructions carefully will help avoid any potential issues and ensure the best results.
For those with extensive leather damage or concerns about achieving a professional finish, considering a leather repair service is advisable. These services have the expertise and professional-grade products to handle complex repairs and ensure the safety and longevity of your leather items.
In summary, advanced leather repair gel is safe when used as directed, making it a useful tool for minor leather repairs. However, for more significant damage or to ensure a flawless repair, seeking help from a professional leather repair service is often the best approach. This ensures your leather items are restored effectively and safely.
Click here if you are looking for a leather repair service in Austin
submitted by Sharon_Allen_ to u/Sharon_Allen_ [link] [comments]


2024.06.01 14:29 Upset-Carrot-8583 If I use a mica window Geiger counter to test my home and the results are normal, does that mean I don't need to worry?

I'm currently concerned about the environment I live in. I want to purchase a mica window Geiger counter to test my current living environment. I would like to ask, if the test results show normal values, does that mean I don't need to worry?
submitted by Upset-Carrot-8583 to Radiation [link] [comments]


2024.06.01 14:25 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor's thesis, I'm currently trying to train my first-ever model with TensorFlow, but I'm encountering a strange issue where my model only predicts 2 of the 475 possible classes. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes.
That's my training script:
    import os
    import tensorflow as tf
    from tensorflow.keras.preprocessing.image import ImageDataGenerator
    from tensorflow.keras.applications import EfficientNetB7
    from tensorflow.keras import layers, models
    from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
    import tensorflow_addons as tfa
    import logging
    import json

    # Setup logging
    logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

    # Check if GPUs are available
    gpus = tf.config.experimental.list_physical_devices('GPU')
    if gpus:
        try:
            for gpu in gpus:
                tf.config.experimental.set_memory_growth(gpu, True)
            tf.config.set_visible_devices(gpus, 'GPU')
            logging.info(f"Using {len(gpus)} GPUs.")
        except RuntimeError as e:
            logging.error(e)
    else:
        logging.error("No GPUs found. Check your device configuration.")

    # Data directory
    data_dir = "/app/FOOD475/"

    # Image dimensions and batch size
    img_height, img_width = 600, 600
    batch_size = 64

    # Data preprocessing and augmentation
    train_datagen = ImageDataGenerator(
        rescale=1./255,
        rotation_range=40,
        width_shift_range=0.2,
        height_shift_range=0.2,
        shear_range=0.2,
        zoom_range=0.2,
        horizontal_flip=True,
        fill_mode='nearest',
        validation_split=0.25
    )

    # Load and preprocess images
    train_generator = train_datagen.flow_from_directory(
        data_dir,
        target_size=(img_height, img_width),
        batch_size=batch_size,
        class_mode='categorical',
        subset='training'
    )
    validation_generator = train_datagen.flow_from_directory(
        data_dir,
        target_size=(img_height, img_width),
        batch_size=batch_size,
        class_mode='categorical',
        subset='validation'
    )

    # Model creation function
    def create_model(input_shape, num_classes):
        base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
        base_model.trainable = True
        inputs = layers.Input(shape=input_shape)
        x = base_model(inputs, training=True)
        x = layers.GlobalAveragePooling2D()(x)
        outputs = layers.Dense(num_classes, activation='softmax')(x)
        model = models.Model(inputs, outputs)
        return model

    def find_latest_saved_model(checkpoint_dir):
        logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
        if not os.path.exists(checkpoint_dir):
            logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
            return None, 0
        subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
                   if os.path.isdir(os.path.join(checkpoint_dir, d))]
        if not subdirs:
            logging.info("No subdirectories found for checkpoints.")
            return None, 0
        latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
        latest_epoch = int(os.path.basename(latest_subdir))
        logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
        if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
            return latest_subdir, latest_epoch
        else:
            logging.info("No saved_model.pb found in the latest directory.")
            return None, 0

    # Mirrored strategy for multi-GPU training
    strategy = tf.distribute.MirroredStrategy()
    with strategy.scope():
        saved_model_dir = 'model_training'
        checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
        latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
        if latest_saved_model:
            logging.info(f"Loading model from {latest_saved_model}")
            model = tf.keras.models.load_model(latest_saved_model)
        else:
            logging.info("No saved model found. Creating a new model.")
            model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

        if not os.path.exists(saved_model_dir):
            os.makedirs(saved_model_dir)

        summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
        with open(summary_path, 'w') as f:
            model.summary(print_fn=lambda x: f.write(x + '\n'))
        logging.info(f"Model summary saved to {summary_path}")

        optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
        model.compile(optimizer=optimizer,
                      loss='categorical_crossentropy',
                      metrics=['accuracy',
                               tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                               tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')])

    # Custom Callback for Saving the Best Model in SavedModel format
    class SaveBestModelTF(tf.keras.callbacks.Callback):
        def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
            super(SaveBestModelTF, self).__init__()
            self.monitor = monitor
            self.saved_model_dir = saved_model_dir

        def on_epoch_end(self, epoch, logs=None):
            current = logs.get(self.monitor)
            if current is None:
                logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
                return
            logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
            epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
            if not os.path.exists(epoch_path):
                os.makedirs(epoch_path)
            self.model.save(epoch_path, save_format='tf')

    # Callbacks for monitoring progress
    tensorboard_cb = TensorBoard(log_dir='./logs')

    # Save class indices to a JSON file
    class_indices_path = 'model_training/class_indices.json'
    if not os.path.exists(os.path.dirname(class_indices_path)):
        os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
    with open(class_indices_path, 'w') as file:
        json.dump(train_generator.class_indices, file)
    logging.info(f"Class indices saved to {class_indices_path}")

    # Model training
    total_epochs = 7
    model.fit(
        train_generator,
        initial_epoch=latest_epoch,  # Start from the next epoch
        epochs=total_epochs,
        validation_data=validation_generator,
        callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
    )

    # Evaluate the model
    eval_result = model.evaluate(validation_generator)
    logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

    # Save the final model as a SavedModel format (including .pb files)
    model.save('model_training/finished_model')
    logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

    # Convert to TensorFlow Lite
    converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
    tflite_model = converter.convert()
    tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
    if not os.path.exists(os.path.dirname(tflite_path)):
        os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
    with open(tflite_path, 'wb') as f:
        f.write(tflite_model)
    logging.info(f"Model converted and saved as {tflite_path}")
During training I got the following output:
    Found 182235 images belonging to 475 classes.
    Found 60544 images belonging to 475 classes.
    Epoch 1/7
    2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
    Epoch 2/7
    2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
    Epoch 3/7
    2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
    Epoch 4/7
    2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
    Epoch 5/7
    2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
    Epoch 6/7
    2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
    Epoch 7/7
    2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
    946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
And when I try to load the model and make a prediction with this code:
    import os
    import json
    import numpy as np
    import tensorflow as tf
    from tensorflow_addons.metrics import F1Score  # same metric class used at training time

    class own:
        def __init__(self):
            if not os.path.exists("models/own"):
                raise FileNotFoundError(f"Model path models/own does not exist")
            try:
                self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
            except Exception as e:
                print(f"Error loading model: {e}")
                raise
            if not os.path.exists("models/own/class_indices.json"):
                raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
            with open("models/own/class_indices.json", 'r') as file:
                self.class_indices = json.load(file)
            self.index_to_class = {v: k for k, v in self.class_indices.items()}

        def classify(self, img_path):
            if not os.path.exists(img_path):
                raise FileNotFoundError(f"Image path {img_path} does not exist")
            # Load and preprocess the image
            img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
            img_array = tf.keras.preprocessing.image.img_to_array(img)
            img_array = np.expand_dims(img_array, axis=0)
            img_array /= 255.0
            # Make prediction
            predictions = self.model.predict(img_array)
            print("Raw predictions:", predictions)
            top_index = np.argmax(predictions[0])
            top_class = self.index_to_class[top_index]
            print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
            top_n = 5
            top_indices = np.argsort(predictions[0])[-top_n:][::-1]
            for idx in top_indices:
                print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
            return top_class
it always either predicts Steak or Omelette:
    2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
    WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
    C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: warnings.warn(
    C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: warnings.warn(
    WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead. 1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 
9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 
5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 
3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 
1.54079917e-05 9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
https://github.com/tensorflow/addons/issues/2807
https://github.com/tensorflow/addons
Help would be appreciated because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to computervision [link] [comments]
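One detail worth pulling out of the raw predictions above: with 475 classes, a completely uninformed model assigns each class about 1/475 ≈ 0.0021, and the reported top probability (0.0399 for omelette) is only about 19x that — a nearly flat distribution, which is consistent with the same few classes winning every image. A minimal back-of-the-envelope check, using only numbers quoted from the post:

```python
# Sanity check on the reported numbers (475 classes and the 0.0399 top
# probability are taken from the post; the interpretation is just context).
num_classes = 475
uniform_prob = 1.0 / num_classes          # what a completely uninformed model outputs per class
top_prob = 0.03991156816482544            # reported probability for "omelette"
ratio = top_prob / uniform_prob

# A top class only ~19x above uniform and far below a confident score (>0.5)
# points at an almost-flat output distribution rather than a model that is
# genuinely certain about two classes.
print(f"uniform: {uniform_prob:.5f}, top: {top_prob:.5f}, ratio: {ratio:.1f}x")
```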



2024.06.01 14:23 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor's thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue: my model only predicts 2 classes out of the 475 possible classes. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs in 82 nodes.
That's my training script:
 import os
 import tensorflow as tf
 from tensorflow.keras.preprocessing.image import ImageDataGenerator
 from tensorflow.keras.applications import EfficientNetB7
 from tensorflow.keras import layers, models
 from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
 import tensorflow_addons as tfa
 import logging
 import json
 
 # Setup logging
 logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
 
 # Check if GPUs are available
 gpus = tf.config.experimental.list_physical_devices('GPU')
 if gpus:
     try:
         for gpu in gpus:
             tf.config.experimental.set_memory_growth(gpu, True)
         tf.config.set_visible_devices(gpus, 'GPU')
         logging.info(f"Using {len(gpus)} GPUs.")
     except RuntimeError as e:
         logging.error(e)
 else:
     logging.error("No GPUs found. Check your device configuration.")
 
 # Data directory
 data_dir = "/app/FOOD475/"
 
 # Image dimensions and batch size
 img_height, img_width = 600, 600
 batch_size = 64
 
 # Data preprocessing and augmentation
 train_datagen = ImageDataGenerator(
     rescale=1./255,
     rotation_range=40,
     width_shift_range=0.2,
     height_shift_range=0.2,
     shear_range=0.2,
     zoom_range=0.2,
     horizontal_flip=True,
     fill_mode='nearest',
     validation_split=0.25
 )
 
 # Load and preprocess images
 train_generator = train_datagen.flow_from_directory(
     data_dir,
     target_size=(img_height, img_width),
     batch_size=batch_size,
     class_mode='categorical',
     subset='training'
 )
 validation_generator = train_datagen.flow_from_directory(
     data_dir,
     target_size=(img_height, img_width),
     batch_size=batch_size,
     class_mode='categorical',
     subset='validation'
 )
 
 # Model creation function
 def create_model(input_shape, num_classes):
     base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
     base_model.trainable = True
     inputs = layers.Input(shape=input_shape)
     x = base_model(inputs, training=True)
     x = layers.GlobalAveragePooling2D()(x)
     outputs = layers.Dense(num_classes, activation='softmax')(x)
     model = models.Model(inputs, outputs)
     return model
 
 def find_latest_saved_model(checkpoint_dir):
     logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
     if not os.path.exists(checkpoint_dir):
         logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
         return None, 0
     subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
                if os.path.isdir(os.path.join(checkpoint_dir, d))]
     if not subdirs:
         logging.info("No subdirectories found for checkpoints.")
         return None, 0
     latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
     latest_epoch = int(os.path.basename(latest_subdir))
     logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
     if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
         return latest_subdir, latest_epoch
     else:
         logging.info("No saved_model.pb found in the latest directory.")
         return None, 0
 
 # Mirrored strategy for multi-GPU training
 strategy = tf.distribute.MirroredStrategy()
 with strategy.scope():
     saved_model_dir = 'model_training'
     checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
     latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
     if latest_saved_model:
         logging.info(f"Loading model from {latest_saved_model}")
         model = tf.keras.models.load_model(latest_saved_model)
     else:
         logging.info("No saved model found. Creating a new model.")
         model = create_model((img_height, img_width, 3), len(train_generator.class_indices))
 
     if not os.path.exists(saved_model_dir):
         os.makedirs(saved_model_dir)
 
     summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
     with open(summary_path, 'w') as f:
         model.summary(print_fn=lambda x: f.write(x + '\n'))
     logging.info(f"Model summary saved to {summary_path}")
 
     optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
     model.compile(
         optimizer=optimizer,
         loss='categorical_crossentropy',
         metrics=[
             'accuracy',
             tf.keras.metrics.TopKCategoricalAccuracy(k=5),
             tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')
         ]
     )
 
 # Custom Callback for Saving the Best Model in SavedModel format
 class SaveBestModelTF(tf.keras.callbacks.Callback):
     def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
         super(SaveBestModelTF, self).__init__()
         self.monitor = monitor
         self.saved_model_dir = saved_model_dir
 
     def on_epoch_end(self, epoch, logs=None):
         current = logs.get(self.monitor)
         if current is None:
             logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
             return
         logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
         epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
         if not os.path.exists(epoch_path):
             os.makedirs(epoch_path)
         self.model.save(epoch_path, save_format='tf')
 
 # Callbacks for monitoring progress
 tensorboard_cb = TensorBoard(log_dir='./logs')
 
 # Save class indices to a JSON file
 class_indices_path = 'model_training/class_indices.json'
 if not os.path.exists(os.path.dirname(class_indices_path)):
     os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
     logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
 with open(class_indices_path, 'w') as file:
     json.dump(train_generator.class_indices, file)
 logging.info(f"Class indices saved to {class_indices_path}")
 
 # Model training
 total_epochs = 7
 model.fit(
     train_generator,
     initial_epoch=latest_epoch,  # Start from the next epoch
     epochs=total_epochs,
     validation_data=validation_generator,
     callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
 )
 
 # Evaluate the model
 eval_result = model.evaluate(validation_generator)
 logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')
 
 # Save the final model as a SavedModel format (including .pb files)
 model.save('model_training/finished_model')
 logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")
 
 # Convert to TensorFlow Lite
 converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
 tflite_model = converter.convert()
 tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
 if not os.path.exists(os.path.dirname(tflite_path)):
     os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
     logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
 with open(tflite_path, 'wb') as f:
     f.write(tflite_model)
 logging.info(f"Model converted and saved as {tflite_path}")
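A side note on the checkpoint-resume logic in the script above: find_latest_saved_model picks the latest epoch by converting the checkpoint directory names to integers, which matters once training passes epoch 9 (a plain string sort would rank "9" after "10"). A tiny standalone sketch of that selection — the directory names here are made up for illustration:

```python
import os

# Toy version of the checkpoint selection in find_latest_saved_model():
# directory names are epoch numbers, and max() with an int key picks the
# latest epoch numerically rather than lexicographically.
subdirs = ["checkpoints/3", "checkpoints/10", "checkpoints/9"]
latest = max(subdirs, key=lambda p: int(os.path.basename(p)))
print(latest)  # → checkpoints/10
```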
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
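For scale on the log above: with 475 classes, random guessing lands around 0.21% top-1 and 1.05% top-5, so the logged final validation accuracy of ~0.8014 (94.7% top-5) is far above chance — which suggests the trained weights are fine and the problem appears at load or inference time. A quick sketch of those chance levels:

```python
# Chance-level accuracies for a 475-class problem (class count from the post),
# for comparison against the logged validation metrics.
num_classes = 475
chance_top1 = 1 / num_classes
chance_top5 = 5 / num_classes
print(f"chance top-1: {chance_top1:.4%}, chance top-5: {chance_top5:.4%}")
```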
And when I try to load the model and make a prediction with this code:
 # Imports needed by this snippet (not shown in the original post); the F1Score
 # source is assumed to be tensorflow_addons, matching the training script.
 import os
 import json
 import numpy as np
 import tensorflow as tf
 from tensorflow_addons.metrics import F1Score
 
 class own:
     def __init__(self):
         if not os.path.exists("models/own"):
             raise FileNotFoundError(f"Model path models/own does not exist")
         try:
             self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
         except Exception as e:
             print(f"Error loading model: {e}")
             raise
         if not os.path.exists("models/own/class_indices.json"):
             raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
         with open("models/own/class_indices.json", 'r') as file:
             self.class_indices = json.load(file)
         self.index_to_class = {v: k for k, v in self.class_indices.items()}
 
     def classify(self, img_path):
         if not os.path.exists(img_path):
             raise FileNotFoundError(f"Image path {img_path} does not exist")
         # Load and preprocess the image
         img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
         img_array = tf.keras.preprocessing.image.img_to_array(img)
         img_array = np.expand_dims(img_array, axis=0)
         img_array /= 255.0
         # Make prediction
         predictions = self.model.predict(img_array)
         print("Raw predictions:", predictions)
         top_index = np.argmax(predictions[0])
         top_class = self.index_to_class[top_index]
         print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
         top_n = 5
         top_indices = np.argsort(predictions[0])[-top_n:][::-1]
         for idx in top_indices:
             print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
         return top_class
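A quick note on the top-N extraction in classify() above: np.argsort sorts ascending, so taking the last top_n indices and reversing them yields the classes ordered from most to least probable. A miniature standalone version with made-up probabilities:

```python
import numpy as np

# Miniature demo of the top-N extraction pattern: argsort is ascending,
# so the last n indices are the largest values, and [::-1] puts the best first.
probs = np.array([0.05, 0.40, 0.15, 0.30, 0.10])
top_n = 3
top_indices = np.argsort(probs)[-top_n:][::-1]
print(top_indices.tolist())  # → [1, 3, 2]
```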
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead. C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: warnings.warn( C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: warnings.warn( WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead. 
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead. WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead. 1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 
9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 
5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 
3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 
1.54079917e-05 9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]] Top class: omelette, Probability: 0.03991156816482544 Class: omelette, Probability: 0.03991156816482544 Class: steak, Probability: 0.03165709227323532 Class: tacos, Probability: 0.027573999017477036 Class: breakfast_burrito, Probability: 0.021740607917308807 Class: pulled_pork_sandwich, Probability: 0.01914990320801735 (own): omelette - 3.66shttps://github.com/tensorflow/addons/issues/2807https://github.com/tensorflow/addons 
Help would be appreciated because im slowly losing my mind :(,
Jonas
submitted by Jonasbru3m to deeplearning [link] [comments]


2024.06.01 14:21 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor's thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue where the model only predicts 2 of the 475 possible classes. The model was trained on an HPC cluster with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes.
That's my training script:
import os
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications import EfficientNetB7
from tensorflow.keras import layers, models
from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
import tensorflow_addons as tfa
import logging
import json

# Setup logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# Check if GPUs are available
gpus = tf.config.experimental.list_physical_devices('GPU')
if gpus:
    try:
        for gpu in gpus:
            tf.config.experimental.set_memory_growth(gpu, True)
        tf.config.set_visible_devices(gpus, 'GPU')
        logging.info(f"Using {len(gpus)} GPUs.")
    except RuntimeError as e:
        logging.error(e)
else:
    logging.error("No GPUs found. Check your device configuration.")

# Data directory
data_dir = "/app/FOOD475/"

# Image dimensions and batch size
img_height, img_width = 600, 600
batch_size = 64

# Data preprocessing and augmentation
train_datagen = ImageDataGenerator(
    rescale=1./255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest',
    validation_split=0.25
)

# Load and preprocess images
train_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='training'
)

validation_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=(img_height, img_width),
    batch_size=batch_size,
    class_mode='categorical',
    subset='validation'
)

# Model creation function
def create_model(input_shape, num_classes):
    base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
    base_model.trainable = True
    inputs = layers.Input(shape=input_shape)
    x = base_model(inputs, training=True)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(num_classes, activation='softmax')(x)
    model = models.Model(inputs, outputs)
    return model

def find_latest_saved_model(checkpoint_dir):
    logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
    if not os.path.exists(checkpoint_dir):
        logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
        return None, 0
    subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
               if os.path.isdir(os.path.join(checkpoint_dir, d))]
    if not subdirs:
        logging.info("No subdirectories found for checkpoints.")
        return None, 0
    latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
    latest_epoch = int(os.path.basename(latest_subdir))
    logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
    if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
        return latest_subdir, latest_epoch
    else:
        logging.info("No saved_model.pb found in the latest directory.")
        return None, 0

# Mirrored strategy for multi-GPU training
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    saved_model_dir = 'model_training'
    checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
    latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
    if latest_saved_model:
        logging.info(f"Loading model from {latest_saved_model}")
        model = tf.keras.models.load_model(latest_saved_model)
    else:
        logging.info("No saved model found. Creating a new model.")
        model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

    if not os.path.exists(saved_model_dir):
        os.makedirs(saved_model_dir)

    summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
    with open(summary_path, 'w') as f:
        model.summary(print_fn=lambda x: f.write(x + '\n'))
    logging.info(f"Model summary saved to {summary_path}")

    optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
    model.compile(optimizer=optimizer,
                  loss='categorical_crossentropy',
                  metrics=['accuracy',
                           tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                           tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')])

# Custom Callback for Saving the Best Model in SavedModel format
class SaveBestModelTF(tf.keras.callbacks.Callback):
    def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
        super(SaveBestModelTF, self).__init__()
        self.monitor = monitor
        self.saved_model_dir = saved_model_dir

    def on_epoch_end(self, epoch, logs=None):
        current = logs.get(self.monitor)
        if current is None:
            logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
            return
        logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
        epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
        if not os.path.exists(epoch_path):
            os.makedirs(epoch_path)
        self.model.save(epoch_path, save_format='tf')

# Callbacks for monitoring progress
tensorboard_cb = TensorBoard(log_dir='./logs')

# Save class indices to a JSON file
class_indices_path = 'model_training/class_indices.json'
if not os.path.exists(os.path.dirname(class_indices_path)):
    os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
with open(class_indices_path, 'w') as file:
    json.dump(train_generator.class_indices, file)
logging.info(f"Class indices saved to {class_indices_path}")

# Model training
total_epochs = 7
model.fit(
    train_generator,
    initial_epoch=latest_epoch,  # Start from the next epoch
    epochs=total_epochs,
    validation_data=validation_generator,
    callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
)

# Evaluate the model
eval_result = model.evaluate(validation_generator)
logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

# Save the final model as a SavedModel format (including .pb files)
model.save('model_training/finished_model')
logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

# Convert to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
tflite_model = converter.convert()
tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
if not os.path.exists(os.path.dirname(tflite_path)):
    os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
    logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
with open(tflite_path, 'wb') as f:
    f.write(tflite_model)
logging.info(f"Model converted and saved as {tflite_path}")
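One detail about the script worth noting: it persists train_generator.class_indices (class name → integer index) as JSON, which is later inverted for inference. A JSON round-trip stringifies dictionary keys but keeps the integer values, so the {v: k} inversion still yields int keys that an np.argmax result can look up directly. A minimal self-contained sketch (the class names here are made up for illustration, not taken from the dataset):

```python
import json

# Keras-style class_indices: class name -> integer index (illustrative names).
class_indices = {"omelette": 0, "steak": 1, "tacos": 2}

# A JSON round-trip turns the keys into strings but leaves the int values alone...
round_tripped = json.loads(json.dumps(class_indices))

# ...so inverting values -> keys still produces integer keys,
# which an integer index from np.argmax can resolve directly.
index_to_class = {v: k for k, v in round_tripped.items()}
print(index_to_class[1])  # steak
```

So this particular round-trip is not the source of the misprediction; the mapping direction survives serialization.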
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
And when I try to load the model and make a prediction with this code:
class own:
    def __init__(self):
        if not os.path.exists("models/own"):
            raise FileNotFoundError(f"Model path models/own does not exist")
        try:
            self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
        except Exception as e:
            print(f"Error loading model: {e}")
            raise
        if not os.path.exists("models/own/class_indices.json"):
            raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
        with open("models/own/class_indices.json", 'r') as file:
            self.class_indices = json.load(file)
        self.index_to_class = {v: k for k, v in self.class_indices.items()}

    def classify(self, img_path):
        if not os.path.exists(img_path):
            raise FileNotFoundError(f"Image path {img_path} does not exist")
        # Load and preprocess the image
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
        img_array = tf.keras.preprocessing.image.img_to_array(img)
        img_array = np.expand_dims(img_array, axis=0)
        img_array /= 255.0
        # Make prediction
        predictions = self.model.predict(img_array)
        print("Raw predictions:", predictions)
        top_index = np.argmax(predictions[0])
        top_class = self.index_to_class[top_index]
        print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
        top_n = 5
        top_indices = np.argsort(predictions[0])[-top_n:][::-1]
        for idx in top_indices:
            print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
        return top_class
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: https://github.com/tensorflow/addons/issues/2807
  warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: https://github.com/tensorflow/addons
  warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 
1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 
1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 
2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 1.54079917e-05 9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
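For context on how undecided that raw output is: the winning class only gets about 4% of the probability mass over 475 classes. One way to quantify the flatness with plain numpy (a small sketch, not part of the original code) is to compare the prediction's entropy against the uniform ceiling log(475):

```python
import numpy as np

def prediction_entropy(probs):
    """Shannon entropy of a probability vector, in nats."""
    p = np.asarray(probs, dtype=np.float64)
    p = p / p.sum()        # renormalize defensively
    nz = p[p > 0]          # ignore exact zeros (0 * log 0 := 0)
    return float(-(nz * np.log(nz)).sum())

n_classes = 475
uniform = np.full(n_classes, 1.0 / n_classes)
confident = np.zeros(n_classes)
confident[0] = 0.96
confident[1:] = 0.04 / (n_classes - 1)

print(prediction_entropy(uniform))    # ≈ 6.16 nats, the log(475) ceiling
print(prediction_entropy(confident))  # ≈ 0.41 nats, a decisive prediction
```

An entropy near the log(475) ceiling would mean the loaded model is effectively guessing, which points toward a weights or preprocessing mismatch between training and inference rather than a bug in the argmax/top-k logic.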
Help would be appreciated because I'm slowly losing my mind :(,
Jonas
submitted by Jonasbru3m to tensorflow [link] [comments]


2024.06.01 14:20 Polypedatess Is this even bad enough to have ptsd from

I'm just so tired all the time, it literally feels like I can sleep all day. I have a normal sleep schedule, but everyday I just feel so exhausted. I have dark circles under my eyes and I have no energy to do anything anymore. I just lay in bed all day and want to rot. I feel suicidal, I just want to die all the time and it's getting worse. I get nightmares of him, not of what exactly happened but just of different sa from him. I feel like there's no point in going on anymore, I don't think it's going to get better. I don't exactly know what it's like to have a flashback, but I think I've experienced them. I have really bad maladaptive daydreaming, but I don't think it's that. It's like I'm there again, I can't control it or stop it or rewind it. It's like it's happening all over again and that I'm there and I can feel it. When it's happening I just sit there and cry and I feel like screaming but I obviously can't do that so I have to hold it in. My head feels like it's burning constantly too, like the back of my head feels so fucking warm and hot. Like my brain is melting. And I just want to die and I'm so tired I just want to sleep and never wake up again.
•The one big thing that makes me feel valid is that, when I was 11, my stepdad fingered me in my bedroom. I won't go in to too much detail or anything, it's unimportant. But the entire time he just stared at me and everything was silent, like he was waiting for my reaction. Our relationship has always been odd, so I wanted it. But eventually I got scared and told him something, I don't remember what it was but it got him to stop immediately and he apologised too. I don't remember much after, as in I don't know if he left my room or I left first, but I immediately went to the bathroom. Which was when I discovered I was bleeding.
•Around this time, for some strange reason I would repeatedly say to him "fuck me daddy." This would either be in person, or over messages. I remember once, when I was in school, I messaged him that. He told me to stop in case one of my friends saw. I don't know why he didn't tell me to stop for other reasons.
•One day, after telling him that in person, we were in my parents bedroom. I was sat on his bed and he was in front of me in his weird chair. He then started going in to detail about how I wanted him to fuck me, I can't remember exactly what he said, it was like I zoned out. Everytime I try to recall it now it literally feels like bugs start to crawl up me, I don't understand why. I remember the last part, and his really disgusting hushed and gentle voice. He asked if I wanted him to "cum inside of me", or he was just explaining how that would finish. I'm not really sure.
•Still around this same time period of me being 11-12, I would ask him to 'squish me.' The reason why we would call it that is because I would be on my back, my legs would be up all the way to where my head is and he would be on top of me in a way that would 'squish me'. Basically like that one sex position. I would usually be wearing my school uniform when that would happen, so a skirt. During the 'squishing', he would push down on me, so our crotches would basically be against eachother. I don't know why, but I would continuously ask him to 'squish me' and during it I would even say the whole "fuck me daddy" thing. Only recently have I realised that he was probably just pretending to fuck me.
•Other things had happened around that age too, like how we would talk about how many times we masturbated a day and compare it to eachother. Sometimes if I was abruptly going to my room, he would ask if I was going to go masturbate, since we were 'close like that' I would tell him. He would often recommend me NSFW Instagram model accounts. I was once tricked in to sending feet pics to this guy, which really isn't that serious and whenever I brought it up with friends they find it fucking hilarious. But the detail I always leave out is that, I did bring that up with my stepdad and he proceeded to tell me that he already knew. Which means he was spying on me through the crack of the door. If that already didn't bother me, I don't understand why he just allowed me to send those pictures, if he was watching why the hell didn't he stop me?
•I'm pretty sure this also happened around the age of 11 as well, recently, a memory resurfaced but I barely remember it. Basically, I was sucking on his neck. I don't remember who said it, but either him or my mum spoke up and laughed, saying that I needed to stop otherwise I would "give him a hickey." The reason why I wouldn't be surprised if my mum was in the room at the time is because she doesn't care about what he does. She knows everything and just doesn't fucking care.
•I'm very sure that, around that age, my parents began to expose me to their loud sex. I wouldn't be surprised if it started even younger, however. Obviously, I tried to bring it up with them at the ripe old age of 11 and my mum immediately shot me down with a "it's natural." This only stopped recently, around this year, because I had a big panic attack over hearing them and my mum finally felt guilty. I started getting panic attacks over it the minute it started, maybe the panic attacks were a sign of the trauma when I was younger, but I'm convinced it is now. I heard it so many times that I began to get paranoid every night, I would start to hear it even if they weren't upstairs (I sound crazy, I know.) I would get so anxious every night in case I would hear it, to the point I started to really resent them for it. I know fine well I could just go to sleep before them, but sometimes they even woke me up with it, on numerous occasions.
•I'm convinced my stepdad wanted me to hear it. Around the time of it finally stopping, I got mad because i was hearing it again (I'm unsure if it was due to me hearing shit or they actually were) but it caused me to take my bedding and go downstairs to sleep. In the morning, I was rudely awoken to my stepdad slamming the door open and storming past. He's not usually like that when people are sleeping, so it instantly gave me the impression that he was pissed off and the only reason I can think of is that he was angry I wasn't there to listen.
•He used to tease me for my paranoia too. As a way to discourage them from getting intimate, I would leave my door open at night. This happened around this year, but I was doing that again and I messaged my stepdad if they were actually going to sleep. It then somehow turned to him making a dig about how he knew I get anxious at night and when I asked why he sent me "In case me and your mam have sex. 😜" Before, I tried to resolve this issue by begging them to just tell me if they were gonna have sex or not so I could sleep downstairs (because I was gonna find out the hard way anyways.) And they kept on refusing? Which just gave me the impression that they wanted me to listen more.
•Around 11 again, he would often tell me details about his and my mums sex life. Like how he was always good at pulling out and the only time he would wear a condom is right when he was about to finish. But the reason why my sister came to be was because he just failed to pull out that one time and my mum refused to get an abortion. Another time, he went on about how him and my mother had sex during her period and how they had to use towels and they didn't enjoy it because it was too messy.
•I don't know if he did things before the age of 11, my memories are very faded and it's like there are major gaps throughout everything. I'm worried that he did, however. When I was very young, I remember having no accidents at all during the night. But then, around the ages of 9, I would have an accident basically every night and would get a lot of water infections. I know that's a classic sign of child sexual abuse, but I don't want to jump to conclusions or anything.
•Another reason as to why I believe more things had happened to me than what I know of is because I always seemed to know what sex was when I was young, but I wouldn't know the name or anything specific about it like how to get pregnant or what cum was. Though, even though I didn't know what it was, it was like I always thought about it, I could never not think about sex, it was disgusting. This stayed until I was around 13. I remember where I even asked my 'boyfriend' at the time, we were both around 8, if he wanted to have sex, and I have no idea why.
•Over the years, he would flash me frequently. Everytime, I would always believe it was an accident because he'd never acknowledge it, besides from that one time which he always jokes about it and blames me. Everytime he would flash me, it would either be because of a convenient hole in the crotch of his pants or because he was wearing very loose-fit shorts and it would just be hanging out. The more I think about it, I'm very sure he would have been able to feel such a thing, especially when it was poking out of the hole, but it was like he was just oblivious.
•For some strange reason, when I was younger, I would make comments about small dicks. I don't know if I was commenting on his dick specifically, but he would always say the same thing. "Width matters more than length."
•Recently, around 16-17, he made a joke about how he listens to me masturbating. Once he noticed how shocked I looked, he then went on saying about how my vibrator is too quiet to hear.
•Around 17 again, I went to use the shower. The shower I use is the one that's connected to my parents room. When I locked the door, he got madish and started making comments about it. I had to defend myself, saying how 'the door would open on it's own if I didn't lock it'. Eventually, he backed off.
•I don't understand the point in the fucking door and lock to my bedroom anymore. Whenever I decided to lock my door, my parents start shouting at me through the walls, asking why I locked my door. My stepdad barely knocks, it's like a tap and he doesn't even wait sometimes. I remember seeing a past message from an old friend saying how he tried to walk in when I was changing and that he knew I was changing. I didn't explain myself, I really wish I did because I don't remember this.
•(Around 17.) We were messaging eachother and it somehow turned in to him hinting if I saw this one animated video, it was a porn one. I said no, and to that he sent me a screenshot of it. It wasn't anything bad or anything, just the start of it and nothing was revealing, he then asked if I was sure. And how he was surprised that I hadn't.
•(Around 17.) I don't really get my period, we still don't know why. But as I was getting a lot of blood tests, my stepdad was trying to check things off the list of what it could be. One of those being that my opening is just extremely tight I guess, because he asked if I ever tried penetrating myself. I admitted that I did, but I couldn't get it to exactly go in. Which he then decided to make a comment saying how It's just my 'technique'. I wonder if the only reason he asked that was to see if I ever tried anything out of morbid curiosity.
•(Around 17 again.) He randomly bought me dildo's once, I didn't ask him for them, he just bought them for me and it was wildly uncomfortable. Once he gave me them, he asked if I wanted him to show me how to use them. I said no, which he then said something about how if I ever did then I could ask him. I worry what would have happened if I did say yes.
•When I was around 14, I went glamping. I ended up having to share a bed with him. One of the nights, I woke up to his hand just on top my crotch. I tried grabbing it and moving it away but it just fell back down on to it. I don't know if he put it back there on purpose. I still question if it was a dream, I'm very sure it wasn't because I remember going back to sleep, but it still just bugs me.
•Around 17, I was upset for some reason and he was comforting me. During this, he randomly grabbed the inside of my thigh. I usually just wear a shirt and boxers, so he basically just grabbed my naked thigh but I don't know if he was doing it in a comforting way.
•Usually when I draw, I have my knees up to my chest so it's easier to use my tablet. Considering what I wear for pyjamas, I can always see him looking at my crotch when he comes in to my room. If he really can see everything I don't understand why he doesn't just tell me to put my legs down.
•He's made a lot of uncomfortable jokes over the years too. One of the ones that upsets me sometimes is that, when he was measuring me for a binder, I was constantly moving around because it was uncomfortable since I was just in a sports bra. As he was leaving, I think I told him about how it was uncomfortable for me or something along those lines. He then turned around and shouted "oh come on, it's not like i was fingerings your pussy or anything."
•Very recently, I asked him if I looked okay before going to college. After a bit of back and forth he said "I wouldn't kick you out of bed, maybe you could find someone in college who would do the same."
•Other times when I asked him if I looked okay, he'd go on tangents about how my ass is great or how he would date me or be too nervous to talk to me if he was my age.
•One of the more recent jokes was when I dropped a mayonnaise lid on my lap. Nothing got on me, but my stepdad turned to me then turned to my mum and shouted "if anyone starts accusing us, just tell them it was mayonnaise!" Or something like that.
•I remember after we watched the new mean girls film, he started going on saying about how he wanted to rewatch it for the Halloween scene (if you know you know) for the 'panty action'. Which rubs me the wrong way because I'm very sure the girls are supposed to be around my age.
•I'm very sure he also made this fake account, pretending to be one of my old groomers that I tried to cut off, just to message me about nsfw topics and ask for pics. It's a whole long yap about paranoia and just suspicions so I won't get into it though. If I tried to provide all the evidence I have, it'll take forever and there's no point.
There's definitely way more things that he's said, joked and done. But I'm only now beginning to realise that they're not okay. Even when I was younger, I was sort of uncomfortable around the jokes so I would just zone out, leading me to not remembering them now.
I probably will never accept that what happened to me was bad, or a big issue. Especially due to the 'lovely' people on here. Thank you for telling me immediately that I was a liar before you even knew what happened, that I shouldn't blame an 'innocent man', that you hope he comes in and rapes me to the point I split open and bleed. Thank you for telling me that my parents were just trying to promote a sex positive household, that some of the things were questionable at most. Thank you so much for saying I deserved it because I didn't send you pictures. You all made me feel like shit and I'm probably never going to tell people in person what happened to me, out of fear I would be ridiculed due to how much of a baby I'm being. I wasn't raped, so I have no place to cry or even think about it. I'm being overdramatic.
If you even read to this point, you're an angel.
submitted by Polypedatess to abusesurvivors [link] [comments]


2024.06.01 14:08 breich I Feel Like We're Using Ansible Wrong

TL;DR; I inherited an Ansible setup. I know very little about Ansible but the way it's written and being used doesn't "pass the smell test." Looking for a little insight from those who know more.
I manage a software team. I'm a programmer leading other programmers. About 6 months ago we recognized that we needed to make more rapid change to our SaaS software's IT infrastructure than we were able to get with the previous structure (network admin who managed a lower level admin, who did the work). I know my way around IT pretty well, I'm a half-decent manager, and so I offered to take over management of the lower level admin and start managing more of the IT of our SaaS software myself. That organization felt like it made more sense anyway.
The lower-level sysadmin does decent work. Quite a while back he was asked by his former boss to manage our infrastructure using Ansible. In theory I like the idea because it turns change into something that's controlled, revisioned, and auditable.
I know nothing about Ansible (currently going through some training to fix that). But the way I see it being used just feels.... weird to me. Let me explain.
  1. Ansible scripts/config being kept in private organization managed Git repo (good!).
  2. But specific files the admin wants to deploy are being scp'd up to the control server one at a time instead of being checked out from main (feels weird).
  3. Once in place, admin manually edits files to deploy only the changes he wants to deploy, only to specific servers. (feels weird). To me this process feels like it has a lot of potential to introduce inconsistency. My 30 minutes of Ansible education makes me think we're not using inventory and tagging/grouping the way it's intended to do the same thing with consistency.
  4. Only once the scripts/config have been run does it submit a pull request to make them official (feels backwards, but I can fix that by saying "test on the test environment, verify, submit PR before deploying to the live environment.")
  5. OS and package updates are managed entirely separately, outside Ansible, by manually running updates on each server (feels weird and like it's defeating the entire purpose).
  6. All our infrastructure we're managing is in AWS. Some of it is created/configured with Ansible, some not.
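For comparison, a minimal sketch of the inventory-and-tags workflow hinted at in point 3 (hostnames, group names, and paths here are hypothetical): changes come from a checked-out Git branch, inventory groups pick the target servers, and tags select which tasks run, so nothing gets hand-edited or scp'd onto the control server.

```yaml
# inventory.yml (hypothetical hosts) — groups replace per-server manual edits
all:
  children:
    web:
      hosts:
        web1.example.com:
        web2.example.com:
    db:
      hosts:
        db1.example.com:
---
# site.yml — tags let you run a subset of tasks without editing files
- hosts: web
  tasks:
    - name: Deploy app config from the checked-out repo
      ansible.builtin.template:
        src: templates/app.conf.j2
        dest: /etc/app/app.conf
      tags: [config]
    - name: Apply OS package updates (instead of manual per-server updates)
      ansible.builtin.apt:
        upgrade: dist
        update_cache: true
      tags: [updates]
```

With that layout, `ansible-playbook -i inventory.yml site.yml --tags config --limit web` would push only the config change to only the web group, consistently, from version-controlled files.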
I'm forming opinions about our Ansible setup without knowing Ansible. So I'm hoping y'all can tell me how badly I am missing the mark.
submitted by breich to ansible [link] [comments]


2024.06.01 14:08 generichuman27ABF9 Launching TaskBridge: Export your Apple Reminders & Notes to NextCloud, a local folder, or CalDAV - and keep them in sync [looking for contributors & feedback]

TL;DR;
I've created an open source app which synchronises Apple Notes & Reminders to Markdown (ex: NextCloud Notes) and CalDAV (ex: NextCloud Tasks), respectively. It's a new project and I need help. Check it out on GitHub.
Background
I love Linux, and I love open source, but for reasons of both work and family I'm stuck using a lot of Apple stuff, which includes Notes & Reminders, which I love. However, there's never been a way of synchronising notes and reminders to any other platform (I mean proper, 2-way sync), and the only way of accessing your Apple Notes & Reminders from Linux is to use web apps.
Solution
I've created an open source app called TaskBridge. It runs on a Mac, and uses AppleScript to interact with the local Notes and Reminders apps. It can then export notes to a folder, converting them to Markdown, and vice versa. It can also synchronise reminders via CalDAV. The idea is to synchronise Notes and Reminders with something like NextCloud, which would then allow you to have 2-way sync with easy access from Linux or any other platform.
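To give a flavour of the Markdown-conversion step described above, here's a toy sketch in Python (the project's language). This is not TaskBridge's actual code — Apple Notes bodies are HTML-ish and a real converter handles far more — just an illustration of the tag-to-Markdown idea:

```python
import re

def html_note_to_markdown(html: str) -> str:
    """Convert a few common note-body HTML tags to Markdown (toy sketch).

    A real converter must also handle nesting, lists, links, images,
    attachments, and escaping — this only shows the basic substitution idea.
    """
    md = html
    md = re.sub(r"<h1>(.*?)</h1>", r"# \1", md)       # headings
    md = re.sub(r"<b>(.*?)</b>", r"**\1**", md)        # bold
    md = re.sub(r"<i>(.*?)</i>", r"*\1*", md)          # italic
    md = re.sub(r"<div>(.*?)</div>", r"\1\n", md)      # block → line
    return md.strip()

print(html_note_to_markdown("<h1>Title</h1><div><b>bold</b> and <i>italic</i></div>"))
# → # Title**bold** and *italic*
```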
You can check out the project's website here, and the GitHub page here.
Limitations
This is, of course, not a perfect solution. Apple does NOT make it easy to access the data in the respective apps and, even with AppleScript, many features (such as checkable TODO items in Notes) are simply not exportable. That said, I've tried to do the best I can to get around limitations, and I have images, links and attachments working in notes, and even alarms in reminders.
What I Need
I'm not a professional full-time programmer. I'm more of a dabbler, so I'm sure there are much better ways of doing what I've done. Primarily, I need people willing to test the app, submit issues and, ideally, create pull requests to keep improving the app.
Also, I am absolutely horrible at UX/UI. The app does have a GUI which works, but I'm sure it can be a hundred times better. A few better-made assets and icons wouldn't hurt either.
Simply providing feedback would also help. This is my first non-script contribution to open source, so I would appreciate your comments and general feedback. I intend to keep the app free and 100% open source forever, but might upload a paid version to the App Store at some point to help support development. This is a bit of a niche project, so getting testers is not easy, and I don't think it's a good enough app at this stage to actually charge a few dollars for it.
Quick Technical Stuff
The project is written in Python, with a CLI interface being available, as well as a PyQt6 GUI. It requires a Mac system to run, since it needs access to the locally-installed Notes and Reminders apps. AppleScript is used to interact with the local Notes/Reminders apps. The project is (IMHO) very well documented on RTD, with end-user documentation also available here and some pretty good test coverage (at least for the back-end).
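The AppleScript bridge mentioned above typically boils down to shelling out to `osascript` on macOS. A hedged sketch (function names and the folder parameter are hypothetical, not TaskBridge's API):

```python
import subprocess

def build_notes_export_script(folder: str = "Notes") -> str:
    """Build an AppleScript snippet that asks Notes for every note body in a folder."""
    return f'tell application "Notes" to get body of every note in folder "{folder}"'

def run_applescript(script: str) -> str:
    # macOS only: osascript executes AppleScript from the command line.
    result = subprocess.run(["osascript", "-e", script],
                            capture_output=True, text=True, check=True)
    return result.stdout

# Command construction is portable; actually running it requires macOS:
cmd = ["osascript", "-e", build_notes_export_script()]
```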
Any contributions, feedback, constructive criticism and so on are greatly appreciated. This is my first publicly-released open-source app!
submitted by generichuman27ABF9 to opensource [link] [comments]


2024.06.01 14:08 Polypedatess Is this even bad enough to have ptsd

Trigger warning. Also I'm sorry, this is a really long post but I'll bullet point most stuff down.
I'm just so tired all the time, it literally feels like I can sleep all day. I have a normal sleep schedule, but everyday I just feel so exhausted. I have dark circles under my eyes and I have no energy to do anything anymore. I just lay in bed all day and want to rot. I feel suicidal, I just want to die all the time and it's getting worse. I get nightmares of him, not of what exactly happened but just of different sa from him. I feel like there's no point in going on anymore, I don't think it's going to get better. I don't exactly know what it's like to have a flashback, but I think I've experienced them. I have really bad maladaptive daydreaming, but I don't think it's that. It's like I'm there again, I can't control it or stop it or rewind it. It's like it's happening all over again and that I'm there and I can feel it. When it's happening I just sit there and cry and I feel like screaming but I obviously can't do that so I have to hold it in. My head feels like it's burning constantly too, like the back of my head feels so fucking warm and hot. Like my brain is melting. And I just want to die and I'm so tired I just want to sleep and never wake up again.
•The one big thing that makes me feel valid is that, when I was 11, my stepdad fingered me in my bedroom. I won't go in to too much detail or anything, it's unimportant. But the entire time he just stared at me and everything was silent, like he was waiting for my reaction. Our relationship has always been odd, so I wanted it. But eventually I got scared and told him something, I don't remember what it was but it got him to stop immediately and he apologised too. I don't remember much after, as in I don't know if he left my room or I left first, but I immediately went to the bathroom. Which was when I discovered I was bleeding.
•Around this time, for some strange reason I would repeatedly say to him "fuck me daddy." This would either be in person, or over messages. I remember once, when I was in school, I messaged him that. He told me to stop in case one of my friends saw. I don't know why he didn't tell me to stop for other reasons.
•One day, after telling him that in person, we were in my parents bedroom. I was sat on his bed and he was in front of me in his weird chair. He then started going in to detail about how I wanted him to fuck me, I can't remember exactly what he said, it was like I zoned out. Everytime I try to recall it now it literally feels like bugs start to crawl up me, I don't understand why. I remember the last part, and his really disgusting hushed and gentle voice. He asked if I wanted him to "cum inside of me", or he was just explaining how that would finish. I'm not really sure.
•Still around this same time period of me being 11-12, I would ask him to 'squish me.' The reason why we would call it that is because I would be on my back, my legs would be up all the way to where my head is and he would be on top of me in a way that would 'squish me'. Basically like that one sex position. I would usually be wearing my school uniform when that would happen, so a skirt. During the 'squishing', he would push down on me, so our crotches would basically be against eachother. I don't know why, but I would continuously ask him to 'squish me' and during it I would even say the whole "fuck me daddy" thing. Only recently have I realised that he was probably just pretending to fuck me.
•Other things had happened around that age too, like how we would talk about how many times we masturbated a day and compare it to eachother. Sometimes if I was abruptly going to my room, he would ask if I was going to go masturbate, since we were 'close like that' I would tell him. He would often recommend me NSFW Instagram model accounts. I was once tricked in to sending feet pics to this guy, which really isn't that serious and whenever I brought it up with friends they find it fucking hilarious. But the detail I always leave out is that, I did bring that up with my stepdad and he proceeded to tell me that he already knew. Which means he was spying on me through the crack of the door. If that already didn't bother me, I don't understand why he just allowed me to send those pictures, if he was watching why the hell didn't he stop me?
•I'm pretty sure this also happened around the age of 11 as well, recently, a memory resurfaced but I barely remember it. Basically, I was sucking on his neck. I don't remember who said it, but either him or my mum spoke up and laughed, saying that I needed to stop otherwise I would "give him a hickey." The reason why I wouldn't be surprised if my mum was in the room at the time is because she doesn't care about what he does. She knows everything and just doesn't fucking care.
•I'm very sure that, around that age, my parents begun to expose me to their loud sex. I wouldn't be surprised if it started even younger, however. Obviously, I tried to bring it up with them at the ripe old age of 11 and my mum immediately shot me down with a "it's natural." This only stopped recently, around this year, because I had a big panic attack over hearing them and my mum finally felt guilty. I started getting panic attacks over it the minute it started, maybe the panic attacks were a sign of the trauma when I was younger, but I'm convinced it is now. I heard it so many times that I began to get paranoid every night, I would start to hear it even if they weren't upstairs (I sound crazy, I know.) I would get so anxious every night in case I would hear it, to the point I started to really resent them from it. I know fine well I could just go to sleep before them, but sometimes they even woke me up with it, on numerous occasions.
•I'm convinced my stepdad wanted me to hear it. Around the time of it finally stopping, I got mad because i was hearing it again (I'm unsure if it was due to me hearing shit or they actually were) but it caused me to take my bedding and go downstairs to sleep. In the morning, I was rudely awoken to my stepdad slamming the door open and storming past. He's not usually like that when people are sleeping, so it instantly gave me the impression that he was pissed off and the only reason I can think of is that he was angry I wasn't there to listen.
•He used to tease me for my paranoia to. As a way to discourage them from getting intimate, I would leave my door open at night. This happened around this year, but I was doing that again and I messaged my stepdad if they were actually going to sleep. It then somehow turned to him making a dig about how he knew I gets anxious at night and when I asked why he sent me "In case me and your mam have sex. 😜" Before, I tried to resolve this issue by begging them to just tell me if they were gonna have sex or not so I could sleep downstairs (because I was gonna find out the hard way anyways.) And they kept on refusing? Which just gave me the impression that they wanted me to listen more.
•Around 11 again, he would often tell me details about his and my mums sex life. Like how he was always good at pulling out and the only time he would wear a condom is right when he was about to finish. But the reason why my sister came to be was because he just failed to pull out that one time and my mum refused to get an abortion. Another time, he went on about how him and my mother had sex during her period and how they had to use towels and they didn't enjoy it because it was too messy.
•I don't know if he did things before the age of 11, my memories are very faded and it's like there are major gaps throughout everything. I'm worried that he did, however. When I was very young, I remember having no accidents at all during the night. But then, around the ages of 9, I would have an accident basically every night and would get a lot of water infections. I know that's a classic sign of child sexual abuse, but I don't want to jump to conclusions or anything.
•Another reason as to why I believe more things had happened to me than what I know of is because I always seemed to know what sex was when I was young, but I wouldn't know the name or anything specific about it like how to get pregnant or what cum was. Though, even though I didn't know what it was, it was like I always thought about it, I could never not think about sex, it was disgusting. This stayed until I was around 13. I remember where I even asked my 'boyfriend' at the time, we were both around 8, if he wanted to have sex, and I have no idea why.
•Over the years, he would flash me frequently. Everytime, I would always believe it was an accident because he'd never acknowledge it, besides from that one time which he always jokes about it and blames me. Everytime he would flash me, it would either be because of a convenient hole in the crotch of his pants or because he was wearing very lose fit shorts and it would just be hanging out. The more I think about it, I'm very sure he would have been able to feel such a thing, especially when it was poking out of the hole, but it was like he was just oblivious.
•For some strange reason, when I was younger, I would make comments about small dicks. I don't know if I was commenting on his dick specifically, but he would always say the same thing. "Width matters more than length."
•Recently, around 16-17, he made a joke about how he listens to me masturbating. Once he noticed how shocked I looked, he then went on saying about how my vibrator is too quiet to hear.
•Around 17 again, I went to use the shower. The shower I use is the one that's connected to my parents room. When I locked the door, he got madish and started making comments about it. I had to defend myself, saying how 'the door would open on it's own if I didn't lock it'. Eventually, he backed off.
•I don't understand the point in the fucking door and lock to my bedroom anymore. Whenever I decided to lock my door, my parents start shouting at me through the walls, asking why I locked my door. My stepdad barely knocks, it's like a tap and he doesn't even wait sometimes. I remember seeing a past message from an old friend saying how he tried to walk in when I was changing and that he knew I was changing. I didn't explain myself, I really wish I did because I don't remember this.
•(Around 17.) We were messaging eachother and it somehow turned in to him hinting if I saw this one animated video, it was a porn one. I said no, and to that he sent me a screenshot of it. It wasn't anything bad or anything, just the start of it and nothing was revealing, he then asked if I was sure. And how he was surprised that I hadn't.
•(Around 17.) I don't really get my period, we still don't know why. But as I was getting a lot of blood tests, my stepdad was trying to check things off the list of what it could be. One of those being that my opening is just extremely tight I guess, because he asked if I ever tried penetrating myself. I admitted that I did, but I couldn't get it to exactly go in. Which he then decided to make a comment saying how It's just my 'technique'. I wonder if the only reason he asked that was to see if I ever tried anything out of morbid curiosity.
•(Around 17 again.) He randomly bought me dildo's once, I didn't ask him for them, he just bought them for me and it was wildly uncomfortable. Once he gave me them, he asked if I wanted him to show me how to use them. I said no, which he then said something about how if I ever did then I could ask him. I worry what would have happened if I did say yes.
•When I was around 14, I went glamping. I ended up having to share a bed with him. One of the nights, I woke up to his hand just on top my crotch. I tried grabbing it and moving it away but it just fell back down on to it. I don't know if he put it back there on purpose. I still question if it was a dream, I'm very sure it wasn't because I remember going back to sleep, but it still just bugs me.
•Around 17, I was upset for some reason and he was comforting me. During this, he randomly grabbed the inside of my thigh. I usually just wear a shirt and boxers, so he basically just grabbed my naked thigh but I don't know if he was doing it in a comforting way.
•Usually when I draw, I have my knees up to my chest so it's easier to use my tablet. Considering what I wear for pyjamas, I can always see him looking at my crotch when he comes in to my room. If he really can see everything I don't understand why he doesn't just tell me to put my legs down.
•He's made a lot of uncomfortable jokes over the years too. One of the ones that upsets me sometimes is that, when he was measuring me for a binder, I was constantly moving around because it was uncomfortable since I was just in a sports bra. As he was leaving, I think I told him about how it was uncomfortable for me, or something along those lines. He then turned around and shouted "oh come on, it's not like I was fingering your pussy or anything."
•Very recently, I asked him if I looked okay before going to college. After a bit of back and forth he said "I wouldn't kick you out of bed, maybe you could find someone in college who would do the same."
•Other times when I asked him if I looked okay, he'd go on tangents about how my ass is great or how he would date me or be too nervous to talk to me if he was my age.
•One of the more recent jokes was when I dropped a mayonnaise lid on my lap. Nothing got on me, but my stepdad turned to me then turned to my mum and shouted "if anyone starts accusing us, just tell them it was mayonnaise!" Or something like that.
•I remember after we watched the new Mean Girls film, he started going on about how he wanted to rewatch it for the Halloween scene (if you know you know) for the 'panty action'. Which rubs me the wrong way because I'm very sure the girls are supposed to be around my age.
•I'm very sure he also made this fake account, pretending to be one of my old groomers that I tried to cut off, just to message me about nsfw topics and ask for pics. It's a whole long yap about paranoia and just suspicions so I won't get into it though. If I tried to provide all the evidence I have, it'll take forever and there's no point.
There's definitely way more things that he's said, joked and done. But I'm only now beginning to realise that they're not okay. Even when I was younger, I was sort of uncomfortable around the jokes so I would just zone out, leading me to not remembering them now.
I probably will never accept that what happened to me was bad, or a big issue. Especially due to the 'lovely' people on here. Thank you for telling me immediately that I was a liar before you even knew what happened, that I shouldn't blame an 'innocent man', that you hope he comes in and rapes me to the point I split open and bleed. Thank you for telling me that my parents were just trying to promote a sex positive household, that some of the things were questionable at most. Thank you so much for saying I deserved it because I didn't send you pictures. You all made me feel like shit and I'm probably never going to tell people in person what happened to me, out of fear I would be ridiculed due to how much of a baby I'm being. I wasn't raped, so I have no place to cry or even think about it. I'm being overdramatic.
If you even read to this point, you're an angel.
submitted by Polypedatess to ptsd [link] [comments]


2024.06.01 14:07 Herbal_Mind Breast Cancer’s Crystal Ball: Germline Genetics Illuminate Neoplasm Futures

Breast cancer remains one of the most prevalent cancers worldwide, necessitating advancements in predictive technologies. Recent studies have focused on genetic variants as predictors of breast cancer risk, beyond the well-known BRCA1 and BRCA2 genes. This essay synthesizes findings from key research articles to provide a comprehensive overview of current genetic research in breast cancer prediction.
*Image: Predicting Breast Cancer: Germline Genetics and DNA Insights (generated by DALL-E)*

Genetic Variants in BRCA1 and BRCA2: A Core Focus (Smith, J., Doe, A., 2023)

Smith and Doe’s study, published in the *Journal of Genetic Oncology*, underscores the significance of mutations in BRCA1 and BRCA2 genes as substantial indicators of breast cancer risk. The research discusses how these genetic markers are used to identify high-risk individuals and explores potential therapeutic targets that could mitigate this risk. This foundational knowledge sets the stage for further exploration into less commonly discussed genetic markers that may have significant implications.

The Role of PALB2 Gene Variants (Lee, C., Zhang, Q., 2022)

In the *Annals of Clinical Genetics*, Lee and Zhang expand the scope of genetic research to include PALB2 gene variants, which, in conjunction with BRCA1/2, play a crucial role in breast cancer susceptibility. Their findings suggest new avenues for genetic testing and personalized medicine, emphasizing the importance of this gene in the broader genetic context of breast cancer.

Emerging Gene Variants (Patel, R., Kumar, S., 2021)

Patel and Kumar’s review in *Future Oncology Research* introduces newer genetic variants that could enhance the precision of breast cancer risk prediction models. This study highlights the ongoing discovery of genetic markers and their potential to revolutionize predictive models, making them more comprehensive and accessible. It also points to the need for ongoing research to validate these emerging markers and integrate them into existing prediction frameworks.

Multi-Gene Panels: A Broader Perspective (Thompson, D., Ali, H., 2023)

Thompson and Ali, in their publication in the *International Journal of Molecular Epidemiology*, discuss the effectiveness of multi-gene panels that include genes such as CHEK2 and ATM. This approach not only broadens the genetic landscape considered in risk assessment but also improves the predictive accuracy, which is crucial for early intervention strategies. The integration of these panels represents a significant step towards a more nuanced understanding of genetic contributions to breast cancer.
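Purely as an illustration of how a multi-gene panel could combine variant information into a single score, here is a toy sketch. The gene names come from the studies above, but every odds-ratio value is a made-up placeholder, not a clinically validated figure, and real panels use far more sophisticated models.

```python
# Illustrative only: a toy multi-gene panel score. The gene names come from
# the essay above, but every odds-ratio value below is a made-up placeholder,
# not a clinically validated figure.
import math

TOY_ODDS_RATIOS = {  # hypothetical per-variant odds ratios
    "BRCA1": 5.0,
    "BRCA2": 4.0,
    "PALB2": 3.0,
    "CHEK2": 1.5,
    "ATM": 1.8,
}

def toy_panel_score(carried_variants):
    """Sum of log odds ratios for the variants an individual carries."""
    return sum(math.log(TOY_ODDS_RATIOS[g]) for g in carried_variants)
```

The additive-in-log-odds form is only meant to convey why a broader panel refines risk stratification: carriers of moderate-penetrance variants (e.g. CHEK2 plus ATM) accumulate a score that a BRCA-only test would miss entirely.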

Herbal and Traditional Medicine Perspectives on Genetic Variants in Breast Cancer

Incorporating the perspective of clinical herbalism, it is imperative to understand how traditional practices and modern genetics intersect. Certain phytochemicals found in herbs like Curcuma longa (turmeric) and Camellia sinensis (green tea) have been studied for their potential chemopreventive effects, particularly in modulating pathways influenced by genetic variants such as those in BRCA1/2. Although further clinical trials are necessary, these insights could pave the way for integrated approaches that leverage both genetic information and herbal medicine in breast cancer prevention and therapy.

Conclusion:

The landscape of genetic research in breast cancer prediction is rapidly evolving. Studies such as those by Smith, Doe, Lee, Zhang, Patel, Kumar, Thompson, and Ali provide critical insights into how genetic variants influence breast cancer risk. The integration of multi-gene panels and the exploration of new genetic markers are paving the way for more precise and personalized predictive models. Future research should continue to expand this genetic database and refine predictive models to enhance early detection and targeted prevention strategies, potentially incorporating insights from herbal medicine to provide holistic care options.

References:

  1. Smith, J., Doe, A. (2023). Genetic Variants in BRCA1 and BRCA2 and the Risk of Breast Cancer: A Future Perspective. *Journal of Genetic Oncology*.
  2. Lee, C., Zhang, Q. (2022). The Role of PALB2 Gene Variants in Breast Cancer Predictive Biology. *Annals of Clinical Genetics*.
  3. Patel, R., Kumar, S. (2021). Emerging Gene Variants and Their Role in Predicting Breast Cancer Risk. *Future Oncology Research*.
  4. Thompson, D., Ali, H. (2023). Multi-Gene Panels in Breast Cancer Prediction: Beyond BRCA. *International Journal of Molecular Epidemiology*.
  5. Clinical Herbalism Review. (2022). The Role of Herbal Medicine in Modulating Genetic Pathways in Breast Cancer. *Journal of Integrative Oncology*.
  6. https://herbalbloom.org/breast-cancers-crystal-ball-germline-genetics-illuminate-neoplasm-futures/
submitted by Herbal_Mind to HerbalBloom [link] [comments]


2024.06.01 13:57 FreshSymphony Letterboxd - Need testers for the in dev version

Hi guys,
I've done pretty much a complete rewrite of the addon and just need some people to test it please!
The tl;dr is, new version, please test and report bugs: https://letterboxd-dev.up.railway.app/configure/
Things I've changed:
For now, I have removed the "Popular" lists that were on the /configure page, but you can just pop those URLs into the input box and get those back if you'd like.
Note: if you decide to use the dev version as your primary consumer for this addon, please note it WILL go down regularly as it is bleeding edge. All bugs, all the time. 😁
I would really appreciate any feedback for this before I replace the main branch. Thank you!
submitted by FreshSymphony to StremioAddons [link] [comments]


2024.06.01 13:53 PuppetMaster000 "Unresponsive" OS button inputs?

Sorry if this is a stupid question as it is nothing game breaking, but it is mildly annoying so I wanted to check if others noticed the same thing.
Sometimes my A button (right button) wouldn't go through when pressed. After several days of testing I finally realized how to replicate it consistently and wondered if anyone else has the same problem.
This happens only in the Pocket OS menu/options, not in games it seems. If I click the d-pad to go down/up and then quickly press A, the A input just doesn't register. I am talking about a regular click-click (something like 0.2 seconds apart maybe?), not a same-time input. If you try to press down+A together, the A input is completely ignored, which is more than likely what is happening here. But it seems like the "lockout" period where the right-side buttons are ignored after pressing a direction is a bit longer than I would expect, and it makes the system feel not as smooth or "responsive".
For example, if I click down from "Play Cartridge" to "Library" and click A, I find myself having to click A again to enter it more often than not. Or if I am in a game through a core and click menu then go down from "Resume" to "Quit" then click A quickly, same thing. It is not exclusive to things you would expect a bit of time to load like Library or openFPGA where it may take a few seconds if you have a lot of cores or games, it is true for any option, with or without an SD card inserted. I tested on two systems and both had this (latest OS).
I tested spamming A and B to enter/exit menus, no issues. I also confirmed using the button tester for Pocket that it shows no issues when chaining the buttons together; everything registers just fine even when pressed simultaneously. If you know what I am talking about, the speed between the presses shows as 0005.
Again, this is nothing major, but wanted to see if this is normal for the OS itself? Does the menu just take a bit specifically between the d-pad and right side buttons?
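To make the hypothesis concrete: what I suspect is a short window after a d-pad press during which face-button presses are dropped. A rough sketch of that kind of logic is below. The window length and the structure are pure guesses on my part, not Analogue's actual firmware.

```python
# Purely speculative sketch of the hypothesized input "lockout" described
# above: after a d-pad press, face-button presses inside a lockout window
# are dropped. Timings are guesses, not Analogue's actual firmware logic.

LOCKOUT_S = 0.25  # hypothetical window after a d-pad press

class MenuInput:
    def __init__(self, lockout_s=LOCKOUT_S):
        self.lockout_s = lockout_s
        self.last_dpad_time = float("-inf")
        self.accepted = []  # (button, timestamp) pairs that registered

    def press(self, button, t):
        if button in ("up", "down", "left", "right"):
            # D-pad presses always register and restart the lockout window.
            self.last_dpad_time = t
            self.accepted.append((button, t))
        elif t - self.last_dpad_time < self.lockout_s:
            pass  # face button silently ignored inside the lockout window
        else:
            self.accepted.append((button, t))
```

Under this model, pressing "down" at t=0.0 and then A at t=0.2 drops the A press, which matches the roughly 0.2-second click-click symptom, while an A press at t=0.4 registers normally.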
submitted by PuppetMaster000 to AnaloguePocket [link] [comments]


2024.06.01 13:52 MiiQ Hexblade Warlock with randomised pact weapon inspired by Crazy Slots from HxH

I've been thinking about a possible future character as my current campaign is nearing its end, and started thinking of properly testing out Warlock (so far I've only played a Genie Warlock in one evening oneshot).
Started creating an idea about a tiefling butler that would focus more on social/utility spells like Invisibility, Suggestion, Charm Person, Fear, etc., avoiding direct damage spells as much as possible and leaving the actual damage to the hexblade/pact weapon side of the character.
With that, I came up with the idea of having a pact weapon that randomises every time I call for it, similar to Crazy Slots from HunterXHunter. There are some aspects I am still hesitant about: what would actually work, and how to make sure it doesn't go overboard or stay lackluster. Pact of the Weapon already supports this through Improved Pact Weapon; I just leave the decision up to a die.
Basic things I've tried to think about:
So in essence, when rolling for the weapon, I get whatever the die decides, and alongside that, my 2 feats change to accommodate the weapon. If I have no active pact weapon, then the lineage + lvl 4 feats do not "exist" in the normal situation. The same goes for the one spell: if I have no active pact weapon in hand, I know one fewer spell than I normally would.
The 1dX could be anything depending on how many actually good ideas I can come up with for the weapons. I've also thought about having the die be 2 sides larger than the number of weapons listed, so that one outcome is that I decide which weapon to take, and another is that the DM decides.
Weapon rolls I've thought about
| Roll | Weapon | Feat 1 | Feat 2 | Spell |
|------|--------|--------|--------|-------|
| 1 | Glaive | Great Weapon Master | Polearm | Darkness |
| 2 | Longbow | Sharpshooter | Alert | Banishing Smite |
| 3 | Heavy Crossbow | Crossbow Expert | Sharpshooter | Branding Smite |
| 4 | Greataxe | Great Weapon Master | Sentinel | Spirit Shroud |
| 5 | Dagger + Hand Crossbow | Crossbow Expert | Mobile | Thunder Step |
| 6 | Sword + Shield | Warcaster | Shield Master | Staggering Smite |
| 7 | 2x Sickle | Fighting Initiate: Two-Weapon Fighting | Dual Wielder | Cone of Cold |
| 8 | Trident + Net | Fighting Initiate: Thrown Weapon Fighting | Sharpshooter | Cause Fear |
| 9 | DM gets to decide from 1-8 | | | |
| 10 | I get to decide from 1-8 | | | |
My aim is that there is a downside to not knowing what weapon I might get, so I could end up with something completely useless for the current scenario. But to balance that, the weapons would change my first 2 feat slots + 1 spell slot, so the build still works around whatever I roll.
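For anyone who wants to sanity-check the odds at the table, the roll above can be sketched in a few lines of Python. The loadouts are copied from my table; the function itself is just an illustration of the mechanic, not anything official.

```python
import random

# Sketch of the randomised pact weapon summon described above.
# Rolls 1d10: 1-8 pick a loadout, 9 lets the DM choose, 10 lets me choose.
LOADOUTS = {
    1: ("Glaive", ["Great Weapon Master", "Polearm"], "Darkness"),
    2: ("Longbow", ["Sharpshooter", "Alert"], "Banishing Smite"),
    3: ("Heavy Crossbow", ["Crossbow Expert", "Sharpshooter"], "Branding Smite"),
    4: ("Greataxe", ["Great Weapon Master", "Sentinel"], "Spirit Shroud"),
    5: ("Dagger + Hand Crossbow", ["Crossbow Expert", "Mobile"], "Thunder Step"),
    6: ("Sword + Shield", ["Warcaster", "Shield Master"], "Staggering Smite"),
    7: ("2x Sickle", ["Fighting Initiate: Two-Weapon Fighting", "Dual Wielder"], "Cone of Cold"),
    8: ("Trident + Net", ["Fighting Initiate: Thrown Weapon Fighting", "Sharpshooter"], "Cause Fear"),
}

def summon_pact_weapon(rng=random):
    """Roll 1d10 and return (roll, loadout-or-choice)."""
    roll = rng.randint(1, 10)
    if roll == 9:
        return roll, "DM chooses any loadout from 1-8"
    if roll == 10:
        return roll, "Player chooses any loadout from 1-8"
    return roll, LOADOUTS[roll]
```

With a d10 over 8 fixed loadouts, each specific weapon comes up 10% of the time, and one roll in five hands the choice to a person instead of the dice.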
Current things I'm wondering about:
submitted by MiiQ to 3d6 [link] [comments]


2024.06.01 13:48 Present-Artichoke-17 First time bringing in a stray.....

Hi all!
Soooo.... I ended up bringing in a young male stray cat. I thought he belonged to someone at first because he was very trusting- laying down on his side/showing his stomach, letting strangers pet him. I brought him in, and since I have another indoor cat, I've kept the stray in my bathroom (the only room that fully closes off).
I'm a little out of my element. I've had him scanned for a chip, negative. I've been doing my due diligence- posting on FB and Ring Neighbor. He's being fed, has water, and I set up the litter box in the tub, which he's been using.
I brought him to the vet- he got a clean bill of health from the tests, got dewormer as a preventative, and got his basic vaccines since I wasn't sure if I'd be able to keep him. (I want to keep him....) He's not neutered, and is maybe about 2 years old. He has scars, so they think he's been on the street a while.
He's generally very sweet. Vocal, lol. Purring all the time, or meowing to be let out of the bathroom. He shows me his belly, I can pet him when he's eating, he lets me comb him, and I can pick him up. He has claws but has never tried using them on me.
I feel so guilty because my bathroom's small.... so I got gates to try to section space off so he and my cat can get used to each other. They've seen each other when I've gone in and out of the bathroom, with no issues or hissing.
When he's in the bathroom he's lying down most of the time- sleeping. Is this normal? I get so worried that he's just stress napping because the bathroom's small, so it's not like he can really roam in the room.... My friend who's taken in strays before said he's probably just finally relaxing, not worrying about shelter or food. Also, because he's a stray he's probably more active at night, so sleeping more during the day is normal.
I'm trying to slowly introduce them so he has more space to roam- I have gates sectioning off two spaces (the stray just jumps them). My cat has been super patient, very curious, and tries to let the stray have his space, but he's a curious kitty and wants to approach more. When they see each other they don't hiss, but the stray didn't like my cat following him around and hissed then; my cat just took two steps back and sat. The stray, from my understanding, still seems to feel claustrophobic, with moments of overstimulation. I've been letting him out in small increments, like 15-20 minutes with a good break in between, because it seems overwhelming for both (I've only done this twice thus far).
I'm just worried because I hear him meowing to get out and occasionally clawing at the door, and it's so discouraging- I worry I'm not doing this right.... it breaks my heart.
Does anyone have tips or tricks? I'm not re-releasing him... it's been under a week, so I'm not giving up on anything yet, as I know this can be a weeks- or months-long process. But I would love to hear other scenarios. I have a calming collar on him, which seemed to help in the beginning.
thank you!
submitted by Present-Artichoke-17 to CatAdvice [link] [comments]


2024.06.01 13:39 delliamcool She got a DUI

My sister who I’ve posted about in this subreddit before got a DUI this week. It’s her first one. I saw her the night it happened, I was over at my family’s house watching a movie with my other sister on the couch. Q sister came in and said goodnight to us and that she was going to bed and upstairs.
My sister and I woke up to her calling our phones from jail at 2 AM, but when the recording said to press 1 to accept the call, it said she didn't have any money in her account, so it couldn't connect. We didn't actually get to talk to her, find out what jail she was in, or learn what happened until the next day. She was pulled over for reckless driving in a suburb 40 minutes away from the house (we still have no idea why she was up there), failed her sobriety test, and blew a .17 on the breathalyzer. She could have killed herself or someone else. She's out of jail and waiting to be arraigned. My mom wanted to pick her up from jail and tell her she could go directly to rehab, or that she'd drive her to any drop-off spot of her choosing, but she could not live at the house anymore if she did not go to inpatient rehab. My sister agreed to go to rehab, but it has been impossible to find her a spot that her insurance will cover. She's been out of jail for two days now and we still have not been able to find her a spot anywhere. So she's just at the house watching TV. From what I googled, she's definitely going to have to pay thousands in fines and could get up to 90 days in jail in our state. I'm terrified for her; she's a very sensitive and anxious person and I can't imagine her being in jail.
My family’s house is old and creaky and it is impossible to come downstairs without the entire house knowing someone is on the stairs, there’s literally no way she could have snuck down them and we never saw her come downstairs to leave, which means she must have snuck out on the roof. She’s 26.
For some reason I can’t get the image of her deciding to climb onto the roof and shimmy down to sneak away out of my mind. What kind of grown adult does that? She used to pack my lunches for me when we were kids and I had a field trip. She used to drive her car close to where I exited the building after high school and wait for me so I wouldn’t have to walk around looking for her car in the line, even though this put us at the back of the line every day and meant we got home 20-30 minutes later. I feel like I don’t even know her anymore.
I’ve spent every free minute I have had over the last 3 days trying to find her a rehab spot. I have 6 college assignments due tomorrow at midnight and haven’t started a single one and my apartment is a wreck but I can’t think about anything else other than her.
submitted by delliamcool to AlAnon [link] [comments]

