
2010.02.19 22:22 Failcake Steam deals: newest deals on Steam

Not all of us have access to Steam every day, so it's nice to have the sales posted to Reddit. Hooray for cheap stuff!
[link]


2018.04.11 17:14 epikotaku How To Get There (Philippines)

Ask the community and get the right directions wherever you like to go: Jeepneys, buses, tricycles, trains, UVs, and more!
[link]


2013.09.25 21:21 ManWithoutModem High Quality Gifs

Welcome to HighQualityGifs, we've got OC gifs here.
[link]


2024.06.01 14:25 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor's thesis I'm currently trying to train my first ever model with TensorFlow, but I'm encountering a strange issue where my model only predicts 2 classes out of the 475 possible classes. The model was trained on an HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs across 82 nodes.
That's my training script:
    import os
    import tensorflow as tf
    from tensorflow.keras.preprocessing.image import ImageDataGenerator
    from tensorflow.keras.applications import EfficientNetB7
    from tensorflow.keras import layers, models
    from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
    import tensorflow_addons as tfa
    import logging
    import json

    # Setup logging
    logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

    # Check if GPUs are available
    gpus = tf.config.experimental.list_physical_devices('GPU')
    if gpus:
        try:
            for gpu in gpus:
                tf.config.experimental.set_memory_growth(gpu, True)
            tf.config.set_visible_devices(gpus, 'GPU')
            logging.info(f"Using {len(gpus)} GPUs.")
        except RuntimeError as e:
            logging.error(e)
    else:
        logging.error("No GPUs found. Check your device configuration.")

    # Data directory
    data_dir = "/app/FOOD475/"

    # Image dimensions and batch size
    img_height, img_width = 600, 600
    batch_size = 64

    # Data preprocessing and augmentation
    train_datagen = ImageDataGenerator(
        rescale=1./255,
        rotation_range=40,
        width_shift_range=0.2,
        height_shift_range=0.2,
        shear_range=0.2,
        zoom_range=0.2,
        horizontal_flip=True,
        fill_mode='nearest',
        validation_split=0.25
    )

    # Load and preprocess images
    train_generator = train_datagen.flow_from_directory(
        data_dir,
        target_size=(img_height, img_width),
        batch_size=batch_size,
        class_mode='categorical',
        subset='training'
    )

    validation_generator = train_datagen.flow_from_directory(
        data_dir,
        target_size=(img_height, img_width),
        batch_size=batch_size,
        class_mode='categorical',
        subset='validation'
    )

    # Model creation function
    def create_model(input_shape, num_classes):
        base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet')
        base_model.trainable = True
        inputs = layers.Input(shape=input_shape)
        x = base_model(inputs, training=True)
        x = layers.GlobalAveragePooling2D()(x)
        outputs = layers.Dense(num_classes, activation='softmax')(x)
        model = models.Model(inputs, outputs)
        return model

    def find_latest_saved_model(checkpoint_dir):
        logging.info(f"Looking in checkpoint directory: {checkpoint_dir}")
        if not os.path.exists(checkpoint_dir):
            logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}")
            return None, 0
        subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir)
                   if os.path.isdir(os.path.join(checkpoint_dir, d))]
        if not subdirs:
            logging.info("No subdirectories found for checkpoints.")
            return None, 0
        latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x)))
        latest_epoch = int(os.path.basename(latest_subdir))
        logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}")
        if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')):
            return latest_subdir, latest_epoch
        else:
            logging.info("No saved_model.pb found in the latest directory.")
            return None, 0

    # Mirrored strategy for multi-GPU training
    strategy = tf.distribute.MirroredStrategy()
    with strategy.scope():
        saved_model_dir = 'model_training'
        checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints')
        latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir)
        if latest_saved_model:
            logging.info(f"Loading model from {latest_saved_model}")
            model = tf.keras.models.load_model(latest_saved_model)
        else:
            logging.info("No saved model found. Creating a new model.")
            model = create_model((img_height, img_width, 3), len(train_generator.class_indices))

        if not os.path.exists(saved_model_dir):
            os.makedirs(saved_model_dir)

        summary_path = os.path.join(saved_model_dir, 'model_summary.txt')
        with open(summary_path, 'w') as f:
            model.summary(print_fn=lambda x: f.write(x + '\n'))
        logging.info(f"Model summary saved to {summary_path}")

        optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002)
        model.compile(optimizer=optimizer,
                      loss='categorical_crossentropy',
                      metrics=['accuracy',
                               tf.keras.metrics.TopKCategoricalAccuracy(k=5),
                               tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')])

    # Custom Callback for Saving the Best Model in SavedModel format
    class SaveBestModelTF(tf.keras.callbacks.Callback):
        def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'):
            super(SaveBestModelTF, self).__init__()
            self.monitor = monitor
            self.saved_model_dir = saved_model_dir

        def on_epoch_end(self, epoch, logs=None):
            current = logs.get(self.monitor)
            if current is None:
                logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.")
                return
            logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}")
            epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1))
            if not os.path.exists(epoch_path):
                os.makedirs(epoch_path)
            self.model.save(epoch_path, save_format='tf')

    # Callbacks for monitoring progress
    tensorboard_cb = TensorBoard(log_dir='./logs')

    # Save class indices to a JSON file
    class_indices_path = 'model_training/class_indices.json'
    if not os.path.exists(os.path.dirname(class_indices_path)):
        os.makedirs(os.path.dirname(class_indices_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(class_indices_path)} created.")
    with open(class_indices_path, 'w') as file:
        json.dump(train_generator.class_indices, file)
    logging.info(f"Class indices saved to {class_indices_path}")

    # Model training
    total_epochs = 7
    model.fit(
        train_generator,
        initial_epoch=latest_epoch,  # Start from the next epoch
        epochs=total_epochs,
        validation_data=validation_generator,
        callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb]
    )

    # Evaluate the model
    eval_result = model.evaluate(validation_generator)
    logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}')

    # Save the final model in SavedModel format (including .pb files)
    model.save('model_training/finished_model')
    logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'")

    # Convert to TensorFlow Lite
    converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model')
    tflite_model = converter.convert()
    tflite_path = 'model_training/lite_model/trained_model_lite.tflite'
    if not os.path.exists(os.path.dirname(tflite_path)):
        os.makedirs(os.path.dirname(tflite_path), exist_ok=True)
        logging.info(f"Directory {os.path.dirname(tflite_path)} created.")
    with open(tflite_path, 'wb') as f:
        f.write(tflite_model)
    logging.info(f"Model converted and saved as {tflite_path}")
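(Not in the script above - a quick sanity check I'd run right after the generators are built, to confirm the dataset really populates all 475 classes:)

    # Sanity check (my addition, not part of the original script): make sure
    # flow_from_directory found 475 class folders and each indexed class has images.
    from collections import Counter

    counts = Counter(train_generator.classes)   # label id -> number of images
    print(len(train_generator.class_indices))   # class folders found, expect 475
    print(len(counts))                          # classes that actually contain images
    print(min(counts.values()))                 # size of the smallest populated class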
During training I got the following output:
Found 182235 images belonging to 475 classes.
Found 60544 images belonging to 475 classes.
Epoch 1/7
2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
Epoch 2/7
2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
Epoch 3/7
2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
Epoch 4/7
2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
Epoch 5/7
2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
Epoch 6/7
2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
Epoch 7/7
2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
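For what it's worth: if the trained weights really only ever produced 2 of the 475 classes, the validation accuracy above could never reach ~0.80 (it would be capped at the combined share of those two classes), so the checkpoint itself looks healthy and the problem seems confined to my inference path.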
And when I try to load the model and make a prediction with this code:
    # (Imports implied by the snippet)
    import os
    import json
    import numpy as np
    import tensorflow as tf
    from tensorflow_addons.metrics import F1Score  # custom metric used at save time

    class own:
        def __init__(self):
            if not os.path.exists("models/own"):
                raise FileNotFoundError("Model path models/own does not exist")
            try:
                self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
            except Exception as e:
                print(f"Error loading model: {e}")
                raise
            if not os.path.exists("models/own/class_indices.json"):
                raise FileNotFoundError("Class indices path models/own/class_indices.json does not exist")
            with open("models/own/class_indices.json", 'r') as file:
                self.class_indices = json.load(file)
            self.index_to_class = {v: k for k, v in self.class_indices.items()}

        def classify(self, img_path):
            if not os.path.exists(img_path):
                raise FileNotFoundError(f"Image path {img_path} does not exist")
            # Load and preprocess the image the same way as in training
            img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
            img_array = tf.keras.preprocessing.image.img_to_array(img)
            img_array = np.expand_dims(img_array, axis=0)
            img_array /= 255.0
            # Make prediction
            predictions = self.model.predict(img_array)
            print("Raw predictions:", predictions)
            top_index = np.argmax(predictions[0])
            top_class = self.index_to_class[top_index]
            print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
            top_n = 5
            top_indices = np.argsort(predictions[0])[-top_n:][::-1]
            for idx in top_indices:
                print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
            return top_class
it always either predicts Steak or Omelette:
2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: warnings.warn(
C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: warnings.warn(
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 
7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 1.54079917e-05 
9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
Top class: omelette, Probability: 0.03991156816482544
Class: omelette, Probability: 0.03991156816482544
Class: steak, Probability: 0.03165709227323532
Class: tacos, Probability: 0.027573999017477036
Class: breakfast_burrito, Probability: 0.021740607917308807
Class: pulled_pork_sandwich, Probability: 0.01914990320801735
(own): omelette - 3.66s
https://github.com/tensorflow/addons/issues/2807
https://github.com/tensorflow/addons
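One more experiment I want to try (a sketch under my own assumptions - it reuses the dataset path and SavedModel directory from above): reload the checkpoint with compile=False, so the tensorflow_addons F1Score doesn't have to be resolved at load time, and evaluate it on the same validation pipeline as in training. If that reproduces the logged ~0.80 accuracy, the saved model is fine and my single-image preprocessing is the prime suspect.

    # Sketch: evaluate the reloaded model on the training-time validation pipeline
    # to separate "broken checkpoint" from "broken inference preprocessing".
    import tensorflow as tf
    from tensorflow.keras.preprocessing.image import ImageDataGenerator

    datagen = ImageDataGenerator(rescale=1./255, validation_split=0.25)
    val_gen = datagen.flow_from_directory(
        "/app/FOOD475/", target_size=(600, 600), batch_size=64,
        class_mode='categorical', subset='validation', shuffle=False)

    # compile=False avoids needing the tfa F1Score custom object at load time
    model = tf.keras.models.load_model("models/own", compile=False)
    model.compile(loss='categorical_crossentropy', metrics=['accuracy'])
    print(model.evaluate(val_gen))  # expect something near the logged val_accuracy of 0.80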
Help would be appreciated, because I'm slowly losing my mind :(
Jonas
submitted by Jonasbru3m to computervision [link] [comments]


2024.06.01 14:25 FrontBrick8048 General Guide to Mitigating setup.app on iOS 16-17

Using some of the information from u/Alternative_Return_4, I was able to do some experimentation, get around setup.app, and access some iOS apps on iOS 16-17.
To recreate this, follow these steps:
  1. On the Hello Screen, turn VoiceOver on (the default way to do this is by triple-clicking the side button on iPhone X and later).
  2. Tap the screen to select the cursive "Hello" text (when done correctly, a large box reaching the borders of the screen will center on it), then swipe right with three fingers. This opens the widget drawer. Now turn VoiceOver off by triple-clicking the side button again.
  3. Swipe down past the widgets to open Spotlight search. You can now access apps and some settings that setup.app hasn't blocked.
I tested most of the preinstalled iOS apps; here are the ones that setup.app hasn't blocked: Siri Shortcuts, Clock, Notes, and Books.
submitted by FrontBrick8048 to setupapp [link] [comments]


2024.06.01 14:23 the_korben Please help me understand composition, V-sync and tearing with Nvidia

Hi, I've been gaming on Linux with my RTX 4070 Ti for about half a year and I have been very happy with the results and with my transition from Windows. However, there is one issue that keeps popping up that I can't seem to fix or even understand the origins of: occasional tearing. It's a problem that has been discussed a lot on the internet. But it has been discussed for so long - over the last 5-15 years - that you can find so many different proposed "solutions", contradictory statements, and unrelated sub-discussions that it gets even harder to understand what the cause of the problem even is.
And this is basically my intent here: I would like to understand the technical cause of the problem in the Linux gaming-component stack and am therefore looking for answers from people who understand some of the technical background. I would like to ask you not to divert the discussion by statements like "use Wayland" or "use AMD" or "use another distro" or something like that, because it doesn't really help in understanding the root of the problem and the nature of the proposed solutions. Thank you! In order to facilitate the discussion, I will try to be as detailed as possible in my descriptions, so sorry if this post is too long for your taste. :)

The problem
First off, a short description of the problem. I have a dual-monitor setup with a primary monitor running at 120 Hz and the secondary running at 60 Hz. I'm using GNOME with X11 via a somewhat modified version of Ubuntu 22.04. VRR is disabled. (I am currently using the 550.67 driver but 535 looks the same.)
I play many games on this machine and most of them run perfectly fine. For instance, I just played MotoGP 24, a very recent game running on the Unreal engine, and I can run it at 120 fps without any problems whatsoever. The framerate is perfectly smooth and there is no tearing. Then there are other games - the latest example I tried is Shadow of Mordor - where I can also run the game very well at 120 fps (after unlocking the framerate in the config files), but at seemingly random times bad tearing pops up, stays for about 20 seconds, and then vanishes again. This is obviously very distracting.
A somewhat related problem happens with games that are internally capped to 60 fps. Most of these run perfectly fine, as they should, since I am running at 120 Hz. However, some of them sometimes also get these random tearing episodes or produce frame-time inconsistencies that likewise lead to tearing or small stutters, which should not happen because I have more than enough headroom to run these games at 60 fps.
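(In principle, a 60 fps cap on a 120 Hz display should be the easy case: each frame simply persists for exactly two refresh cycles, so the only way to tear is if a finished frame is handed over mid-scanout instead of on a vblank boundary.)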

The interpretation
So, given my ~30 years of PC gaming experience, this of course sounds like a synchronization problem. Under Windows - before the advent of VRR - the solution was simply to "enable V-sync in the Nvidia Control Panel". With that setting you wouldn't get any tearing anywhere because apparently the Nvidia driver could easily control the flow of frames from the frame buffers to the display. It seems that on my Linux machine, this is not the case. There is some cross-talk going on between various components or processes and I end up with parts of frames that should not be on the screen.
Now this of course immediately brings the "explicit sync" question to my mind, but my impression was that this was a Wayland topic and that X11 handles things differently anyway. So maybe X11 is the ultimate root of the problem, but I don't have the technical expertise on this topic to come to that conclusion, and I would still like to know the root cause of the effect.

The stack
In order to try and identify what is involved in this process, I am now listing everything that is involved in creating a frame and then displaying it. First off, we have the game engine. It uses APIs to create calls to draw the images and also handles some of the synchronization settings on its own, with in-game V-sync settings or framecaps. Then comes DXVK or VKD3D or WineD3D, and perhaps some other components of Wine/Proton, that translate these draw calls to Vulkan or OpenGL. The Vulkan calls are then interpreted by the Nvidia driver, which sends the results to X11. Somewhere in between we may have the Nvidia "Composition Pipeline" that can be enabled in the display settings of the nvidia-settings application. The X server, with GNOME's compositor on top, then gets the frames and talks to the OS to send the images to the display. I hope this is at least a reasonably close description of what is going on.
Somewhere in the middle of that chain - at least on my machine - sits MangoHud, which potentially caps frames or forces V-sync once again. Then there is gamemode, which also changes some settings for the system. We could also use Gamescope to sit in between the game and X11/Wayland - but I rarely use it and haven't found that it improves things with tearing-prone games. And then there is the SteamOverlay sitting somewhere on top as well.

The fixes
So somewhere in that setup above is where tearing is happening. To fix it, the internet gives us tons of different advice. We can "Force Composition Pipeline" or "Force Full Composition Pipeline" in the nvidia-settings. We can set "Sync to V-blank" in the OpenGL settings of the Nvidia driver. We can even enable TripleBuffering in the driver settings in the xorg config, or do some even more arcane things like setting "UseNvKmsCompositionPipeline" to "false" (which helps a ton with stuttering in my case) and setting environment variables like __GL_SYNC_DISPLAY_DEVICE.
Then we can also enable V-sync at the DXVK or VKD3D level by modifying their configs or by using the corresponding MangoHud settings. We can also disable the system-wide compositor. We can do so many things ... and some of those things indeed make it better or worse, but I haven't found settings that can get rid of even just occasional tearing once and for all.

The confusion
This is where I am getting confused. How is it that there is no setting that simply lets the GPU handle V-sync? And why do all these fixes proposed above often contradict each other? Some people advocate for "Force Composition Pipeline", some people say the exact opposite. Some people say TripleBuffering is the solution, some people say it doesn't work. And none of it really helped.
I am also confused about which settings override one another. For instance: would "Sync to V-blank" only affect OpenGL applications but not DXVK? Does forcing V-sync in DXVK just override the V-sync setting in the Nvidia driver, or is it something on top that handles synchronisation, and how would that interact with the "Composition Pipeline" that the Nvidia driver is using? Do in-game V-sync settings cause problems with the "Composition Pipeline", the DXVK V-sync setting, or TripleBuffering? And for that matter: what even is the "Composition Pipeline" of the Nvidia driver, and how would it get rid of the tearing if the driver then forwards the image to X11 anyway and X11 handles the synchronization with the display?
So I guess my ultimate question is this: where exactly does the tearing come from and why would the different solutions above prevent that from happening? Ultimately, tearing is a simple thing. GPU frames are not synchronized with the display refresh rate. But why can't the Nvidia driver simply prepare a consistent image for X11 to fetch every time it asks - at least not for all games at all times? Or to put it differently, why do some settings that are supposed to facilitate this fail in some games some of the time?
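Just to make sure we mean the same thing by "tearing", here is a toy sketch (my own illustration in Python, not actual driver code) of why an unsynchronized buffer flip produces a seam:

    # Toy model: the display scans out one frame every 1/120 s, top to bottom.
    # A buffer flip that lands mid-scanout splits the screen between the old
    # and the new frame; that seam is the tear.
    REFRESH_HZ = 120
    SCANOUT = 1.0 / REFRESH_HZ  # seconds to scan one full frame

    def tear_line(present_offset, lines=2160):
        """Scanline at which a flip `present_offset` seconds after a vblank
        becomes visible; 0 means the flip hit the vblank (no seam)."""
        return int((present_offset % SCANOUT) / SCANOUT * lines)

    print(tear_line(0.0))    # 0: flipped on the vblank, no tear
    print(tear_line(0.003))  # ~777: seam roughly a third of the way down the frame

Whatever layer implements V-sync, all it ultimately does is delay the flip until that offset is zero again.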

The plea
As you can see, I am very confused. I tried so many different combinations of everything mentioned above and none of it has gotten rid of tearing completely - and only some games are affected, while others run perfectly fine. And as someone with a scientific/technical approach to most things, this is making me crazy. If there currently is no way of getting rid of tearing ... fine. But I would simply like to know why it works most of the time but not all of the time, or where I am misunderstanding things.
So, if you understand how the graphics pipeline works on our dear Linux with X11 and Nvidia, and you can explain where the tearing comes from (or how one may even get rid of it), I would be so grateful, because then I could finally enjoy Shadow of Mordor without trying to get research funding from the EU or the National Science Foundation. :)

submitted by the_korben to linux_gaming [link] [comments]


2024.06.01 14:23 Jonasbru3m TensorFlow Model Only Predicts 2 Classes out of 475

Hello Reddit Community,
For my Bachelor Thesis im currently trying to train my first ever model with tensorflow, but I'm encountering a strange issue where my model only predicts 2 classes out of the 475 possible classes. The model was trained on a HPC with 304 Nvidia A100 and 352 Nvidia A40 GPGPUs in 82 nodes.
Thats my training script:
 import os import tensorflow as tf from tensorflow.keras.preprocessing.image import ImageDataGenerator from tensorflow.keras.applications import EfficientNetB7 from tensorflow.keras import layers, models from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard import tensorflow_addons as tfa import logging import json # Setup logging logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s') # Check if GPUs are available gpus = tf.config.experimental.list_physical_devices('GPU') if gpus: try: for gpu in gpus: tf.config.experimental.set_memory_growth(gpu, True) tf.config.set_visible_devices(gpus, 'GPU') logging.info(f"Using {len(gpus)} GPUs.") except RuntimeError as e: logging.error(e) else: logging.error("No GPUs found. Check your device configuration.") # Data directory data_dir = "/app/FOOD475/" # Image dimensions and batch size img_height, img_width = 600, 600 batch_size = 64 # Data preprocessing and augmentation train_datagen = ImageDataGenerator( rescale=1./255, rotation_range=40, width_shift_range=0.2, height_shift_range=0.2, shear_range=0.2, zoom_range=0.2, horizontal_flip=True, fill_mode='nearest', validation_split=0.25 ) # Load and preprocess images train_generator = train_datagen.flow_from_directory( data_dir, target_size=(img_height, img_width), batch_size=batch_size, class_mode='categorical', subset='training' ) validation_generator = train_datagen.flow_from_directory( data_dir, target_size=(img_height, img_width), batch_size=batch_size, class_mode='categorical', subset='validation' ) # Model creation function def create_model(input_shape, num_classes): base_model = EfficientNetB7(include_top=False, input_shape=input_shape, weights='imagenet') base_model.trainable = True inputs = layers.Input(shape=input_shape) x = base_model(inputs, training=True) x = layers.GlobalAveragePooling2D()(x) outputs = layers.Dense(num_classes, activation='softmax')(x) model = models.Model(inputs, outputs) return model def find_latest_saved_model(checkpoint_dir): logging.info(f"Looking in checkpoint directory: {checkpoint_dir}") if not os.path.exists(checkpoint_dir): logging.error(f"Checkpoint directory does not exist: {checkpoint_dir}") return None, 0 subdirs = [os.path.join(checkpoint_dir, d) for d in os.listdir(checkpoint_dir) if os.path.isdir(os.path.join(checkpoint_dir, d))] if not subdirs: logging.info("No subdirectories found for checkpoints.") return None, 0 latest_subdir = max(subdirs, key=lambda x: int(os.path.basename(x))) latest_epoch = int(os.path.basename(latest_subdir)) logging.info(f"Latest model directory: {latest_subdir}, Epoch: {latest_epoch}") if os.path.exists(os.path.join(latest_subdir, 'saved_model.pb')): return latest_subdir, latest_epoch else: logging.info("No saved_model.pb found in the latest directory.") return None, 0 # Mirrored strategy for multi-GPU training strategy = tf.distribute.MirroredStrategy() with strategy.scope(): saved_model_dir = 'model_training' checkpoint_dir = os.path.join(saved_model_dir, 'checkpoints') latest_saved_model, latest_epoch = find_latest_saved_model(checkpoint_dir) if latest_saved_model: logging.info(f"Loading model from {latest_saved_model}") model = tf.keras.models.load_model(latest_saved_model) else: logging.info("No saved model found. 
Creating a new model.") model = create_model((img_height, img_width, 3), len(train_generator.class_indices)) if not os.path.exists(saved_model_dir): os.makedirs(saved_model_dir) summary_path = os.path.join(saved_model_dir, 'model_summary.txt') with open(summary_path, 'w') as f: model.summary(print_fn=lambda x: f.write(x + '\n')) logging.info(f"Model summary saved to {summary_path}") optimizer = tf.keras.optimizers.Adam(learning_rate=0.0002) model.compile(optimizer=optimizer, loss='categorical_crossentropy', metrics=['accuracy', tf.keras.metrics.TopKCategoricalAccuracy(k=5), tfa.metrics.F1Score(num_classes=len(train_generator.class_indices), average='macro')]) # Custom Callback for Saving the Best Model in SavedModel format class SaveBestModelTF(tf.keras.callbacks.Callback): def __init__(self, monitor='val_accuracy', saved_model_dir='model_training'): super(SaveBestModelTF, self).__init__() self.monitor = monitor self.saved_model_dir = saved_model_dir def on_epoch_end(self, epoch, logs=None): current = logs.get(self.monitor) if current is None: logging.warning(f"Monitor '{self.monitor}' for saving the model is not available in logs.") return logging.info(f"Epoch {epoch + 1}: saving model to {self.saved_model_dir}/checkpoints/{epoch + 1}") epoch_path = os.path.join(self.saved_model_dir, 'checkpoints', str(epoch + 1)) if not os.path.exists(epoch_path): os.makedirs(epoch_path) self.model.save(epoch_path, save_format='tf') # Callbacks for monitoring progress tensorboard_cb = TensorBoard(log_dir='./logs') # Save class indices to a JSON file class_indices_path = 'model_training/class_indices.json' if not os.path.exists(os.path.dirname(class_indices_path)): os.makedirs(os.path.dirname(class_indices_path), exist_ok=True) logging.info(f"Directory {os.path.dirname(class_indices_path)} created.") with open(class_indices_path, 'w') as file: json.dump(train_generator.class_indices, file) logging.info(f"Class indices saved to {class_indices_path}") # Model training total_epochs = 7 model.fit( train_generator, initial_epoch=latest_epoch, # Start from the next epoch epochs=total_epochs, validation_data=validation_generator, callbacks=[SaveBestModelTF(saved_model_dir=saved_model_dir), tensorboard_cb] ) # Evaluate the model eval_result = model.evaluate(validation_generator) logging.info(f'Validation Loss: {eval_result[0]}, Validation Accuracy: {eval_result[1]}') # Save the final model as a SavedModel format (including .pb files) model.save('model_training/finished_model') logging.info("Finished model saved in SavedModel format at 'model_training/finished_model'") # Convert to TensorFlow Lite converter = tf.lite.TFLiteConverter.from_saved_model('model_training/finished_model') tflite_model = converter.convert() tflite_path = 'model_training/lite_model/trained_model_lite.tflite' if not os.path.exists(os.path.dirname(tflite_path)): os.makedirs(os.path.dirname(tflite_path), exist_ok=True) logging.info(f"Directory {os.path.dirname(tflite_path)} created.") with open(tflite_path, 'wb') as f: f.write(tflite_model) logging.info(f"Model converted and saved as {tflite_path}") 
During training I got the following output:
 Found 182235 images belonging to 475 classes.
 Found 60544 images belonging to 475 classes.
 Epoch 1/7
 2848/2848 [==============================] - 11914s 4s/step - loss: 1.7624 - accuracy: 0.5931 - top_k_categorical_accuracy: 0.8152 - f1_score: 0.4739 - val_loss: 1.1666 - val_accuracy: 0.7043 - val_top_k_categorical_accuracy: 0.9013 - val_f1_score: 0.6053
 Epoch 2/7
 2848/2848 [==============================] - 11096s 4s/step - loss: 0.8293 - accuracy: 0.7788 - top_k_categorical_accuracy: 0.9435 - f1_score: 0.7094 - val_loss: 0.9409 - val_accuracy: 0.7533 - val_top_k_categorical_accuracy: 0.9277 - val_f1_score: 0.6818
 Epoch 3/7
 2848/2848 [==============================] - 11123s 4s/step - loss: 0.6247 - accuracy: 0.8274 - top_k_categorical_accuracy: 0.9632 - f1_score: 0.7760 - val_loss: 0.8422 - val_accuracy: 0.7761 - val_top_k_categorical_accuracy: 0.9386 - val_f1_score: 0.7080
 Epoch 4/7
 2848/2848 [==============================] - 11101s 4s/step - loss: 0.5070 - accuracy: 0.8562 - top_k_categorical_accuracy: 0.9743 - f1_score: 0.8165 - val_loss: 0.8002 - val_accuracy: 0.7885 - val_top_k_categorical_accuracy: 0.9428 - val_f1_score: 0.7249
 Epoch 5/7
 2848/2848 [==============================] - 11079s 4s/step - loss: 0.4261 - accuracy: 0.8766 - top_k_categorical_accuracy: 0.9814 - f1_score: 0.8445 - val_loss: 0.7757 - val_accuracy: 0.7940 - val_top_k_categorical_accuracy: 0.9458 - val_f1_score: 0.7404
 Epoch 6/7
 2848/2848 [==============================] - 11100s 4s/step - loss: 0.3641 - accuracy: 0.8932 - top_k_categorical_accuracy: 0.9856 - f1_score: 0.8657 - val_loss: 0.7639 - val_accuracy: 0.8003 - val_top_k_categorical_accuracy: 0.9472 - val_f1_score: 0.7432
 Epoch 7/7
 2848/2848 [==============================] - 11129s 4s/step - loss: 0.3142 - accuracy: 0.9068 - top_k_categorical_accuracy: 0.9889 - f1_score: 0.8838 - val_loss: 0.7701 - val_accuracy: 0.8014 - val_top_k_categorical_accuracy: 0.9470 - val_f1_score: 0.7474
 946/946 [==============================] - 2671s 3s/step - loss: 0.7682 - accuracy: 0.8008 - top_k_categorical_accuracy: 0.9470 - f1_score: 0.7456
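(Not from the original post, but a note for anyone debugging along: given the ~0.80 val_accuracy above versus the near-flat predictions below, one way to narrow things down is to reload the exported SavedModel and re-evaluate it on the same validation split. If the reloaded model still scores around 0.80, the weights survived the save/load round trip and the problem is more likely on the inference side. A minimal sketch, reusing the paths and parameters from the training script; the validation split is rebuilt with rescaling only, since augmentation is irrelevant for evaluation.)

 import tensorflow as tf
 import tensorflow_addons as tfa
 from tensorflow.keras.preprocessing.image import ImageDataGenerator

 # Rebuild the validation split the same way the training script defines it
 datagen = ImageDataGenerator(rescale=1./255, validation_split=0.25)
 validation_generator = datagen.flow_from_directory(
     "/app/FOOD475/",
     target_size=(600, 600),
     batch_size=64,
     class_mode='categorical',
     subset='validation'
 )

 # F1Score must be passed as a custom object when reloading
 model = tf.keras.models.load_model(
     'model_training/finished_model',
     custom_objects={'F1Score': tfa.metrics.F1Score}
 )

 # Accuracy close to the ~0.80 above would point away from the checkpoint
 # and towards the inference-side preprocessing instead
 print(model.evaluate(validation_generator))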
And when I try to load the model and make a prediction with this code:
 # Imports assumed by this snippet (they are not shown in the original excerpt)
 import os
 import json

 import numpy as np
 import tensorflow as tf
 from tensorflow_addons.metrics import F1Score


 class own:
     def __init__(self):
         if not os.path.exists("models/own"):
             raise FileNotFoundError(f"Model path models/own does not exist")
         try:
             self.model = tf.keras.models.load_model("models/own", custom_objects={'F1Score': F1Score})
         except Exception as e:
             print(f"Error loading model: {e}")
             raise
         if not os.path.exists("models/own/class_indices.json"):
             raise FileNotFoundError(f"Class indices path models/own/class_indices.json does not exist")
         with open("models/own/class_indices.json", 'r') as file:
             self.class_indices = json.load(file)
         self.index_to_class = {v: k for k, v in self.class_indices.items()}

     def classify(self, img_path):
         if not os.path.exists(img_path):
             raise FileNotFoundError(f"Image path {img_path} does not exist")
         # Load and preprocess the image
         img = tf.keras.preprocessing.image.load_img(img_path, target_size=(600, 600))
         img_array = tf.keras.preprocessing.image.img_to_array(img)
         img_array = np.expand_dims(img_array, axis=0)
         img_array /= 255.0
         # Make prediction
         predictions = self.model.predict(img_array)
         print("Raw predictions:", predictions)
         top_index = np.argmax(predictions[0])
         top_class = self.index_to_class[top_index]
         print(f"Top class: {top_class}, Probability: {predictions[0][top_index]}")
         top_n = 5
         top_indices = np.argsort(predictions[0])[-top_n:][::-1]
         for idx in top_indices:
             print(f"Class: {self.index_to_class[idx]}, Probability: {predictions[0][idx]}")
         return top_class
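(Aside, not part of the original post: the class above would be driven roughly like this; the image path is hypothetical.)

 classifier = own()
 top = classifier.classify("test_images/example.jpg")  # prints raw predictions and the top 5
 print(top)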
it always predicts either Steak or Omelette:
 2024-06-01 14:17:27.571776: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
 WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.
 C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\tfa_eol_msg.py:23: UserWarning: TensorFlow Addons (TFA) has ended development and introduction of new features. TFA has entered a minimal maintenance and release mode until a planned end of life in May 2024. Please modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). For more information see: https://github.com/tensorflow/addons/issues/2807
   warnings.warn(
 C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\tensorflow_addons\utils\ensure_tf_install.py:53: UserWarning: Tensorflow Addons supports using Python ops for all Tensorflow versions above or equal to 2.12.0 and strictly below 2.15.0 (nightly versions are not supported). The versions of TensorFlow you are currently using is 2.15.0 and is not supported. Some things might work, some things might not. If you were to encounter a bug, do not file an issue. If you want to make sure you're using a tested and supported configuration, either change the TensorFlow version or the TensorFlow Addons's version. You can find the compatibility matrix in TensorFlow Addon's readme: https://github.com/tensorflow/addons
   warnings.warn(
 WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\saving\legacy\saved_model\load.py:107: The name tf.gfile.Exists is deprecated. Please use tf.io.gfile.exists instead.
 2024-06-01 14:17:31.363666: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX2 AVX512F AVX512_VNNI AVX512_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
 WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\engine\functional.py:156: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.
 WARNING:tensorflow:From C:\Users\[Name]\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\src\layers\normalization\batch_normalization.py:979: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.
1/1 [==============================] - 4s 4s/step Raw predictions: [[4.23421043e-05 1.45377373e-06 1.09034730e-02 1.19525917e-04 4.45407240e-05 5.72818244e-05 5.68609731e-03 5.15926695e-05 1.89958355e-05 1.39491487e-04 3.20717366e-03 9.63417915e-06 1.22947793e-03 4.01171012e-04 3.64649204e-05 1.75396308e-05 3.09416023e-03 7.56465085e-03 2.89075997e-05 3.90331191e-03 2.16231216e-03 4.18351328e-06 5.89632022e-04 9.40740295e-03 6.80321036e-03 2.32697069e-03 4.23964392e-03 1.56047070e-04 2.14435873e-04 6.95710623e-05 1.38103365e-04 1.78470847e-03 3.75193194e-03 5.94434096e-03 5.69255608e-05 7.57165905e-03 1.52613886e-03 9.48755944e-04 8.21925176e-04 3.18029453e-03 3.89393512e-03 8.41296278e-05 8.34997976e-04 3.14124190e-04 6.81638776e-04 1.10320523e-02 1.10815199e-04 6.18589204e-03 2.17406079e-02 3.72037102e-05 1.65579877e-05 1.30886221e-02 1.01435784e-04 2.13157946e-05 1.25499619e-05 8.94762017e-03 4.36880719e-03 4.78018774e-03 8.53170827e-03 1.45823974e-02 1.05571962e-05 1.12631078e-05 5.09415939e-03 8.12840741e-03 1.48212257e-05 1.52864438e-02 9.66716034e-05 2.25000476e-04 3.60531732e-04 9.28066402e-06 8.15156789e-04 1.09069003e-02 3.43796797e-04 2.53324561e-05 7.89516326e-03 1.44943051e-05 4.06841224e-04 1.67445414e-05 3.78527766e-05 1.80476491e-04 3.33699776e-04 4.13847056e-06 3.32273915e-03 6.51864940e-03 7.48403618e-05 2.68448726e-04 1.54245936e-03 2.95383972e-03 2.26996126e-05 3.64100002e-03 2.81597768e-05 3.11967051e-05 1.48438021e-05 8.46863433e-04 4.05767525e-04 1.75380992e-04 4.76581818e-06 5.42160356e-04 2.19287374e-03 1.18714366e-02 1.41884899e-04 8.76697595e-06 3.85931274e-03 4.37544841e-05 4.01919424e-05 3.87528981e-03 3.88057524e-05 2.69062322e-04 4.46968805e-03 1.17368818e-05 3.70194939e-05 1.55831876e-04 1.63894765e-05 2.38729117e-04 1.19046052e-03 2.12675819e-04 1.08185853e-03 3.01667496e-05 6.18575094e-03 3.91955400e-05 1.40065713e-05 3.02084809e-04 6.46927813e-03 3.37069832e-05 5.15250103e-05 2.31142567e-05 2.20274273e-03 3.17445702e-05 1.04452763e-02 6.80019803e-05 7.81101780e-03 1.23853814e-02 1.04819983e-02 3.20679283e-05 6.71340758e-03 6.94293885e-06 1.98310101e-03 5.29599565e-05 9.02036484e-03 4.57535089e-06 1.93145883e-03 4.06190008e-03 8.42716638e-03 1.50314684e-03 8.58115556e-04 1.22383237e-03 8.49474862e-04 5.48258470e-03 6.09953167e-05 1.57669128e-03 5.43692382e-03 4.88058169e-04 6.75312986e-05 3.43937165e-04 1.93276245e-03 4.06867871e-03 5.20323374e-05 7.78318281e-05 1.93508764e-04 1.14409677e-05 2.21324177e-03 1.90052821e-03 8.52691382e-03 2.43102224e-03 2.88419239e-03 2.53974522e-05 9.51182563e-04 2.32981285e-03 9.86064842e-05 4.14316915e-03 1.66544644e-03 1.02754391e-04 3.95776224e-05 3.02393187e-06 1.32082617e-02 4.14707232e-04 3.40229672e-05 4.81802830e-03 1.90598912e-05 4.08358377e-04 5.95443300e-04 1.22634810e-04 5.74091624e-04 8.57623760e-03 2.60962266e-03 2.95263715e-03 1.58088005e-05 1.64122172e-02 2.09987498e-04 2.36775051e-03 3.00696083e-05 3.46693669e-05 1.16249910e-04 6.94001559e-03 1.58400853e-05 1.95188422e-05 2.19169408e-04 3.09433235e-04 5.44128183e-04 6.35302160e-04 7.07127433e-03 1.19772732e-04 5.37439200e-06 1.91133395e-02 1.27979312e-02 3.89739592e-03 1.97048103e-05 2.29625002e-05 2.21050854e-04 1.92064399e-04 1.20139657e-05 3.20516920e-05 4.26828819e-06 3.64828011e-05 7.55213068e-06 2.67963973e-03 3.17923805e-05 6.19895945e-05 3.99544797e-06 2.68664648e-04 1.83274597e-02 8.71072552e-05 1.38439747e-04 4.96710254e-06 3.56023484e-05 1.34899991e-03 2.05766381e-04 3.96062108e-03 5.61600551e-03 5.31910664e-05 6.77773132e-05 1.36139952e-02 
7.41477634e-05 1.63904135e-03 4.74587978e-06 1.45082246e-04 2.09337009e-06 8.13181920e-04 3.63194500e-04 6.46722084e-03 5.02364383e-05 6.90550078e-05 6.36972545e-05 2.09673337e-04 1.79036579e-05 2.36021675e-04 6.37291942e-06 5.70875318e-06 2.56235455e-03 2.72009202e-04 3.77103061e-05 5.63449021e-06 2.25979857e-05 2.61697169e-05 3.42375762e-03 1.04161156e-02 2.22223607e-05 6.27681802e-05 1.88465419e-04 2.82149922e-05 4.01149562e-04 1.31122259e-04 5.97863036e-05 2.41098423e-05 7.71318519e-05 3.57087993e-04 3.41462255e-05 1.01930054e-04 5.23206063e-06 2.95026781e-04 7.02897159e-05 3.99115682e-02 1.89455808e-03 1.74146010e-06 1.14775894e-05 7.84916210e-06 1.93041191e-03 2.37918808e-03 3.49449110e-03 6.98623667e-03 7.64393993e-03 4.12582303e-05 1.24030013e-03 1.72785169e-03 7.18316660e-05 5.17749111e-04 7.84919783e-03 1.04525541e-04 9.83856899e-06 8.77521088e-05 1.68125369e-02 4.09213862e-05 1.09552668e-04 2.54421811e-05 4.65482954e-05 6.95294410e-04 6.72869501e-05 2.40904570e-04 2.15112406e-04 3.85226776e-05 2.51369456e-05 4.68338234e-03 1.26862462e-04 9.00995801e-04 4.16984549e-05 7.36891707e-06 1.51534463e-04 1.48332631e-03 4.95935837e-03 1.91499032e-02 3.01804044e-04 6.28613270e-05 4.78365598e-03 8.38827982e-05 1.70516931e-02 1.52653758e-03 5.85798814e-04 3.11521399e-05 2.11968741e-04 7.41351105e-05 1.40834545e-05 8.93215940e-04 1.45371505e-05 4.96711982e-05 4.11317131e-04 8.89070239e-03 5.06997202e-03 3.08362325e-03 2.77415646e-04 3.75299685e-04 1.19906381e-05 1.50029315e-03 1.14443043e-04 2.52026439e-05 9.22407198e-04 3.51146841e-03 1.11564566e-06 1.36691102e-04 3.53032886e-03 2.15746608e-04 8.79282816e-05 4.36248304e-03 1.77966576e-04 1.47887832e-03 6.94399816e-04 8.03673174e-04 5.23004041e-04 3.90421192e-04 1.06344873e-03 3.55399796e-04 6.01265463e-04 1.55850008e-04 1.33491016e-03 1.09734829e-04 4.38019342e-04 2.42487862e-04 6.84730615e-03 1.02040754e-03 1.07652310e-03 3.51822848e-04 9.20735547e-05 7.50967592e-04 1.44127226e-02 3.58455327e-05 5.16555374e-05 1.31370616e-03 9.02966480e-04 1.24254671e-03 5.20300702e-04 8.57163919e-04 3.66344648e-05 2.01024144e-04 6.52487564e-04 5.93215809e-04 5.76604251e-03 6.19325438e-04 1.16480421e-03 2.37531040e-05 2.50119111e-03 7.08868974e-05 5.99786472e-05 2.55976247e-05 4.62695534e-05 4.24469297e-04 6.20667648e-04 4.15926515e-05 7.03983005e-06 8.77018738e-06 5.21141301e-05 2.11411956e-04 7.74205779e-04 5.31276630e-04 6.44316664e-04 4.07212786e-03 2.68336060e-03 1.74210854e-05 3.76385942e-05 6.74255705e-03 4.46323538e-05 2.76757801e-05 2.56290223e-04 1.22213329e-04 1.22734054e-03 7.73016480e-04 1.11903930e-02 3.16570923e-02 2.75775470e-04 5.73344238e-04 2.86890985e-03 1.10085262e-03 1.35615155e-05 2.66479654e-03 1.99418981e-03 4.31017601e-04 9.68350447e-04 3.51598108e-04 8.54862970e-04 3.52715979e-05 1.46333405e-04 5.10955288e-05 1.48639630e-03 1.80458324e-03 7.51840998e-05 1.13529910e-04 3.89828119e-06 8.74532212e-04 1.12358983e-04 3.93593837e-05 6.01037289e-04 2.06997487e-04 3.94766452e-03 1.09549124e-04 2.11403880e-04 6.95336203e-04 5.99777419e-03 5.45272342e-05 2.56420486e-03 2.20299728e-04 4.23851707e-05 6.69996080e-04 2.66609713e-04 1.55276459e-04 2.75739990e-02 3.43240798e-03 2.68303775e-05 1.52821158e-04 9.82575657e-05 4.00313947e-05 6.07266993e-05 5.28094570e-05 1.02948405e-04 6.20577412e-05 2.12161940e-05 2.99842539e-03 1.17558768e-04 1.58015324e-03 3.30074807e-04 1.19093776e-04 2.52985101e-05 1.59350988e-02 4.89539379e-05 1.05491054e-05 1.09012712e-04 2.97089737e-05 7.28885690e-03 1.87386977e-05 1.85028894e-05 5.79945299e-05 1.54079917e-05 
 9.85169099e-05 1.05076749e-03 7.55816349e-04 2.62255053e-05 1.18091421e-05 2.95209320e-05]]
 Top class: omelette, Probability: 0.03991156816482544
 Class: omelette, Probability: 0.03991156816482544
 Class: steak, Probability: 0.03165709227323532
 Class: tacos, Probability: 0.027573999017477036
 Class: breakfast_burrito, Probability: 0.021740607917308807
 Class: pulled_pork_sandwich, Probability: 0.01914990320801735
 (own): omelette - 3.66s
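(An observation on the raw output; my aside, not from the original logs: with 475 classes a uniform distribution would put 1/475 ≈ 0.0021 on each class, and the top score here is only ~0.04, so the vector is much closer to flat than to a confident prediction. A small sketch to quantify that, where probs is one row of the predictions printed above.)

 import numpy as np

 def flatness_report(probs):
     # Compare a softmax output against the uniform distribution
     n = probs.size
     entropy = -np.sum(probs * np.log(probs + 1e-12))
     print(f"top prob: {probs.max():.4f} (uniform would be {1.0 / n:.4f})")
     print(f"entropy ratio: {entropy / np.log(n):.3f} (1.0 = perfectly flat)")

 # e.g. flatness_report(predictions[0]) with the array printed above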
Help would be appreciated because I'm slowly losing my mind :(,
Jonas
submitted by Jonasbru3m to deeplearning [link] [comments]


2024.06.01 14:23 curious_investr How to manage finances after being laid off?

Background: Tier-1 engineering graduate, 12+ years of experience in startups. Married, 35 years old. Homemaker wife. Two kids (3y, 1y). Laid off a year ago. No steady income since then, just some freelancing work. Not a lot of job opportunities at my experience level.
Finance: Total corpus = ~4.1Cr
1Cr - invested in various assets for 15 years, earmarked for the kids' graduation and marriage.
1.6Cr - Flat where I live. 1Cr paid and 60L loan. 10y tenure remaining. 80k EMI
50L - ESOPs. Can become 2x or 0 or remain unchanged; absolutely uncertain.
1.5Cr - additional liquid funds, not invested anywhere.
Household Expenses: 50k monthly
Goal: With the market uncertain, I don't see myself getting a job anytime soon. I want to go with plan B and not depend on a job to earn. I don't want to touch the kids' investments; I can modify everything else.
Out of the liquid 1.5Cr, I am thinking of using 50L to pay down the loan and bring the EMI under 10k. The remaining 1Cr I wish to invest for a steady monthly income to take care of household expenses (50-60L in a commercial shop for rental income, and the rest in Wint Wealth bonds for ~10% return).
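(Not part of the original post, but a quick sanity-check of the numbers above in Python. The 60L outstanding loan and 10y remaining tenure are from the post; the loan's interest rate is not stated, so the 9.5% below is a placeholder to swap for the real figure.)

 def emi(principal, annual_rate, years):
     # Standard EMI formula: P * r * (1+r)^n / ((1+r)^n - 1), with monthly rate r
     r = annual_rate / 12
     n = years * 12
     return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

 outstanding = 60_00_000   # 60L outstanding on the flat
 prepayment = 50_00_000    # proposed prepayment from the liquid corpus
 rate = 0.095              # assumed annual rate; NOT given in the post

 print(f"EMI after 50L prepayment: {emi(outstanding - prepayment, rate, 10):,.0f}")
 print(f"Monthly income from 1Cr at ~10%: {1_00_00_000 * 0.10 / 12:,.0f}")

Whether the EMI actually lands under 10k depends on the real rate and remaining tenure, so it is worth running this with the figures from the loan statement.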
Would you do anything different if you were me?
submitted by curious_investr to personalfinanceindia [link] [comments]


2024.06.01 14:22 Followerbar How to Get Views On Instagram Videos?

How to Get Views On Instagram Videos?
1. Create Unique and Shareable Content
Creating unique and shareable content is key to making it on such a visual platform. If you're making videos, ensure they are always high-quality, engaging, and valuable so people are more likely to like, comment, and share. Keep the aesthetic of your videos cohesive so your feed stays consistent and always provides value.
2. Promote IGTV on Other Channels
Cross-promotion will greatly increase brand visibility, so make sure to promote your IGTV on other channels and social media platforms, such as Facebook, Twitter, and TikTok. Some followers are more active on Facebook than on Instagram, and you want to reach everyone, which is exactly what cross-promotion allows you to do. Not to mention, this will encourage people to follow you on all your social media channels.
3. Share on Your Instagram Stories
Instagram Stories have much more reach than IGTV and feed posts because they are easier to access. Not to mention, followers can view more content in less time than with posts on the feed. Sharing your videos on your Instagram Stories will increase your exposure, and that's how you get more views. To increase IGTV views, create eye-catching Stories so viewers are interested enough to tap on them and watch the full video.
Conclusion
All the valuable tips we provided today are tried and true! If you practice them consistently, your views will grow steadily, and so will your business. If your Instagram views are not increasing, you can use our Buy Instagram Views India service to improve them. Being successful on Instagram involves many moving parts, and it's not something you can accomplish without a marketing strategy. Use these tips to create one and watch your brand explode.
submitted by Followerbar to u/Followerbar [link] [comments]


2024.06.01 14:22 TomCropper Joint mortgage house sale support

Based in England - North West
Long story short, my partner recently ended our 6-year relationship. We own a property together (joint mortgage).
We are both currently still living together as she will not move out, even though she has somewhere to go and I physically have nowhere to go, plus have to finish off parts of the house before I can put it on the market. (I know she has every right to stay in the property).
The property cost £190,000 and I put the full deposit down (£28,500) as well as paid all the solicitors fees etc using funds from selling my own property prior to the purchase of this one.
I also pay 65% of all bills, in comparison to my ex paying 35% due to our wage difference.
I have done a multitude of jobs on the property and paid for all work carried out on it. Her father is a joiner by trade and did help with a lot of the work on the property, and the agreement was that this would count towards the part of her stake in the house that she could not provide financially.
Unfortunately, I did not get a deed of trust or get a tenants in common mortgage to cover the money I put into the property as I never anticipated this happening.
We had a verbal agreement that if we ever sold the house, I would take my deposit back, then split the remaining funds equally due to her father helping me. However, the house was intended to be a longer term investment and we also have around £15k of debt to pay off once the house is sold. So we may struggle to get back the deposit, pay off all debts and have much left over.
Unfortunately, my ex does not appear to be willing to honour said agreement and I know she is legally entitled to half due to it being a joint mortgage.
Is there anything I can do here?
Thank you in advance!
submitted by TomCropper to LegalAdviceUK [link] [comments]


2024.06.01 14:20 TwoFacedComedian [online] [5E] [EST] beginner looking for a game

Hey guys, I've been invited to a few games recently and unfortunately none of them worked out due to me losing my job. I would really like to get something set in motion; I'll even take the day off work so we can play (I got a job serving). I'm a beginner and have been wanting to play for years. I have two sets of dice that have never been battle-tested lol. Please let me know if you have a spot in your group or want to start a new group together. Friendly to all🤍
submitted by TwoFacedComedian to lfg [link] [comments]


2024.06.01 14:20 efthegreat I saw a post about The Witcher and it got me thinking...

Earlier I saw a post in The Witcher (highly recommended btw) sub discussing the origin of elves: in the books an elf mockingly said that humans evolved from monkeys, so the question was, if we're monkeys, then where did elves come from? I did some thinking, compared the many parallels between our world and the world of The Witcher, and came to a conclusion. So here's my answer, hopefully it's some food for thought, enjoy!
My understanding is that, biased as he might be, there could be some truth in his words.
Look at the real world: if you believe in the existence of aliens and do some research on the various types of humanoid aliens, you'll probably find some alien races that look somewhat like us, except far more beautiful of course, and more technologically (or, in the case of The Witcher, magically) advanced than us.
Why did I bring up magic? Well, you don't exactly need the high tech we have on Earth, as there is magic energy on that planet that will do the trick. Who needs a gun when you can shoot lightning from your hands? No need for an airplane when skilled mages can open a portal to a different planet with a simple wave of their hand. You get the idea.
Back to our world: the humanoid aliens that likely exist here, like the Nordics or the Pleiadeans, don't look that different from us, except a little taller, with slightly different facial features and, judging from some of the portraits I've seen, some of them having pointed ears. Hey, just like elves, right? However similar in appearance we might be, humans on Earth do not share a common ancestry with the Pleiadeans: we originated and evolved here, and they originated there, light years from us.
Whether they evolved naturally or were "created by gods", aka high-dimensional beings that likely exist in incorporeal forms, through means we cannot comprehend, is a different topic.
If we do meet and interact with them in the future, be that diplomatically, or if we're somehow sucked into their world or vice versa, Conjunction of the Spheres style, it would be near impossible for us to compete with them on every level except birthrates, and they likely will not see us as their equals, as is the attitude of elves towards humans in The Witcher: barbarians from a different planet.
Maybe this is exactly what it is between humans and elves in The Witcher: the elves are to humans what the Pleiadeans are to us.
submitted by efthegreat to starseeds [link] [comments]


2024.06.01 14:20 ALittleImprovement Psychology notes

Hello, as an M24 student with a predicted 7 in psychology SL, I was forced to self-study the whole psychology curriculum, as my teacher's lessons consisted of her reading off pre-prepared presentation slides and not being able to answer any of the students' questions. This resulted in me making a ton of notes on each topic that I would go back to every now and then and add the new things I would learn. I have a concise and complete set of handwritten notes on each of the three core topics and the developmental psychology option; they cover the theoretical concepts in depth, with lesser-known facts that could "impress the examiner to get you a high grade", critical thinking points (also limitations, strengths and applications) and a couple of studies described. I also have notes done on the printed copies of the original studies, highlighting critical thinking points and key details mentioned by the researchers. I have also created digital flashcards summarizing each of the SAQ topics with theoretical background and a description of 1 study (including details about the participants and percentage results, apart from the general required outline). I also have Quizlets focused on remembering the details from the studies listed in the flashcards, like names, dates, the area of the brain that was influenced, the name of the specific inhibitor etc. I will be selling the notes now that I am done with the program. I can sell them in chunks - just the notes, just the flashcards, by topic etc. DM me if you're interested and tell me how much you would be willing to pay :))
submitted by ALittleImprovement to IBO [link] [comments]


2024.06.01 14:20 Polypedatess Is this even bad enough to have ptsd from

I'm just so tired all the time, it literally feels like I can sleep all day. I have a normal sleep schedule, but every day I just feel so exhausted. I have dark circles under my eyes and I have no energy to do anything anymore. I just lay in bed all day and want to rot. I feel suicidal, I just want to die all the time and it's getting worse. I get nightmares of him, not of what exactly happened but just of different sa from him. I feel like there's no point in going on anymore, I don't think it's going to get better. I don't exactly know what it's like to have a flashback, but I think I've experienced them. I have really bad maladaptive daydreaming, but I don't think it's that. It's like I'm there again, I can't control it or stop it or rewind it. It's like it's happening all over again and that I'm there and I can feel it. When it's happening I just sit there and cry and I feel like screaming but I obviously can't do that so I have to hold it in. My head feels like it's burning constantly too, like the back of my head feels so fucking warm and hot. Like my brain is melting. And I just want to die and I'm so tired I just want to sleep and never wake up again.
•The one big thing that makes me feel valid is that, when I was 11, my stepdad fingered me in my bedroom. I won't go in to too much detail or anything, it's unimportant. But the entire time he just stared at me and everything was silent, like he was waiting for my reaction. Our relationship has always been odd, so I wanted it. But eventually I got scared and told him something, I don't remember what it was but it got him to stop immediately and he apologised too. I don't remember much after, as in I don't know if he left my room or I left first, but I immediately went to the bathroom. Which was when I discovered I was bleeding.
•Around this time, for some strange reason I would repeatedly say to him "fuck me daddy." This would either be in person, or over messages. I remember once, when I was in school, I messaged him that. He told me to stop in case one of my friends saw. I don't know why he didn't tell me to stop for other reasons.
•One day, after telling him that in person, we were in my parents bedroom. I was sat on his bed and he was in front of me in his weird chair. He then started going in to detail about how I wanted him to fuck me, I can't remember exactly what he said, it was like I zoned out. Everytime I try to recall it now it literally feels like bugs start to crawl up me, I don't understand why. I remember the last part, and his really disgusting hushed and gentle voice. He asked if I wanted him to "cum inside of me", or he was just explaining how that would finish. I'm not really sure.
•Still around this same time period of me being 11-12, I would ask him to 'squish me.' The reason why we would call it that is because I would be on my back, my legs would be up all the way to where my head is and he would be on top of me in a way that would 'squish me'. Basically like that one sex position. I would usually be wearing my school uniform when that would happen, so a skirt. During the 'squishing', he would push down on me, so our crotches would basically be against eachother. I don't know why, but I would continuously ask him to 'squish me' and during it I would even say the whole "fuck me daddy" thing. Only recently have I realised that he was probably just pretending to fuck me.
•Other things had happened around that age too, like how we would talk about how many times we masturbated a day and compare it to eachother. Sometimes if I was abruptly going to my room, he would ask if I was going to go masturbate, since we were 'close like that' I would tell him. He would often recommend me NSFW Instagram model accounts. I was once tricked in to sending feet pics to this guy, which really isn't that serious and whenever I brought it up with friends they find it fucking hilarious. But the detail I always leave out is that, I did bring that up with my stepdad and he proceeded to tell me that he already knew. Which means he was spying on me through the crack of the door. If that already didn't bother me, I don't understand why he just allowed me to send those pictures, if he was watching why the hell didn't he stop me?
•I'm pretty sure this also happened around the age of 11 as well, recently, a memory resurfaced but I barely remember it. Basically, I was sucking on his neck. I don't remember who said it, but either him or my mum spoke up and laughed, saying that I needed to stop otherwise I would "give him a hickey." The reason why I wouldn't be surprised if my mum was in the room at the time is because she doesn't care about what he does. She knows everything and just doesn't fucking care.
•I'm very sure that, around that age, my parents began to expose me to their loud sex. I wouldn't be surprised if it started even younger, however. Obviously, I tried to bring it up with them at the ripe old age of 11 and my mum immediately shot me down with an "it's natural." This only stopped recently, around this year, because I had a big panic attack over hearing them and my mum finally felt guilty. I started getting panic attacks over it the minute it started, maybe the panic attacks were a sign of the trauma when I was younger, but I'm convinced it is now. I heard it so many times that I began to get paranoid every night, I would start to hear it even if they weren't upstairs (I sound crazy, I know.) I would get so anxious every night in case I would hear it, to the point I started to really resent them for it. I know fine well I could just go to sleep before them, but sometimes they even woke me up with it, on numerous occasions.
•I'm convinced my stepdad wanted me to hear it. Around the time of it finally stopping, I got mad because i was hearing it again (I'm unsure if it was due to me hearing shit or they actually were) but it caused me to take my bedding and go downstairs to sleep. In the morning, I was rudely awoken to my stepdad slamming the door open and storming past. He's not usually like that when people are sleeping, so it instantly gave me the impression that he was pissed off and the only reason I can think of is that he was angry I wasn't there to listen.
•He used to tease me for my paranoia too. As a way to discourage them from getting intimate, I would leave my door open at night. This happened around this year, but I was doing that again and I messaged my stepdad asking if they were actually going to sleep. It then somehow turned to him making a dig about how he knew I get anxious at night and when I asked why he sent me "In case me and your mam have sex. 😜" Before, I tried to resolve this issue by begging them to just tell me if they were gonna have sex or not so I could sleep downstairs (because I was gonna find out the hard way anyways.) And they kept on refusing? Which just gave me the impression that they wanted me to listen more.
•Around 11 again, he would often tell me details about his and my mum's sex life. Like how he was always good at pulling out and the only time he would wear a condom is right when he was about to finish. But the reason why my sister came to be was because he just failed to pull out that one time and my mum refused to get an abortion. Another time, he went on about how he and my mother had sex during her period and how they had to use towels and they didn't enjoy it because it was too messy.
•I don't know if he did things before the age of 11, my memories are very faded and it's like there are major gaps throughout everything. I'm worried that he did, however. When I was very young, I remember having no accidents at all during the night. But then, around the ages of 9, I would have an accident basically every night and would get a lot of water infections. I know that's a classic sign of child sexual abuse, but I don't want to jump to conclusions or anything.
•Another reason as to why I believe more things had happened to me than what I know of is because I always seemed to know what sex was when I was young, but I wouldn't know the name or anything specific about it like how to get pregnant or what cum was. Though, even though I didn't know what it was, it was like I always thought about it, I could never not think about sex, it was disgusting. This stayed until I was around 13. I remember where I even asked my 'boyfriend' at the time, we were both around 8, if he wanted to have sex, and I have no idea why.
•Over the years, he would flash me frequently. Everytime, I would always believe it was an accident because he'd never acknowledge it, besides from that one time which he always jokes about it and blames me. Everytime he would flash me, it would either be because of a convenient hole in the crotch of his pants or because he was wearing very lose fit shorts and it would just be hanging out. The more I think about it, I'm very sure he would have been able to feel such a thing, especially when it was poking out of the hole, but it was like he was just oblivious.
•For some strange reason, when I was younger, I would make comments about small dicks. I don't know if I was commenting on his dick specifically, but he would always say the same thing. "Width matters more than length."
•Recently, around 16-17, he made a joke about how he listens to me masturbating. Once he noticed how shocked I looked, he then went on saying about how my vibrator is too quiet to hear.
•Around 17 again, I went to use the shower. The shower I use is the one that's connected to my parents room. When I locked the door, he got madish and started making comments about it. I had to defend myself, saying how 'the door would open on it's own if I didn't lock it'. Eventually, he backed off.
•I don't understand the point in the fucking door and lock to my bedroom anymore. Whenever I decided to lock my door, my parents start shouting at me through the walls, asking why I locked my door. My stepdad barely knocks, it's like a tap and he doesn't even wait sometimes. I remember seeing a past message from an old friend saying how he tried to walk in when I was changing and that he knew I was changing. I didn't explain myself, I really wish I did because I don't remember this.
•(Around 17.) We were messaging eachother and it somehow turned in to him hinting if I saw this one animated video, it was a porn one. I said no, and to that he sent me a screenshot of it. It wasn't anything bad or anything, just the start of it and nothing was revealing, he then asked if I was sure. And how he was surprised that I hadn't.
•(Around 17.) I don't really get my period, we still don't know why. But as I was getting a lot of blood tests, my stepdad was trying to check things off the list of what it could be. One of those being that my opening is just extremely tight I guess, because he asked if I ever tried penetrating myself. I admitted that I did, but I couldn't get it to exactly go in. Which he then decided to make a comment saying how It's just my 'technique'. I wonder if the only reason he asked that was to see if I ever tried anything out of morbid curiosity.
•(Around 17 again.) He randomly bought me dildo's once, I didn't ask him for them, he just bought them for me and it was wildly uncomfortable. Once he gave me them, he asked if I wanted him to show me how to use them. I said no, which he then said something about how if I ever did then I could ask him. I worry what would have happened if I did say yes.
•When I was around 14, I went glamping. I ended up having to share a bed with him. One of the nights, I woke up to his hand just on top of my crotch. I tried grabbing it and moving it away, but it just fell back down onto it. I don't know if he put it back there on purpose. I still question if it was a dream; I'm very sure it wasn't, because I remember going back to sleep, but it still just bugs me.
•Around 17, I was upset for some reason and he was comforting me. During this, he randomly grabbed the inside of my thigh. I usually just wear a shirt and boxers, so he basically just grabbed my naked thigh, but I don't know if he meant it in a comforting way.
•Usually when I draw, I have my knees up to my chest so it's easier to use my tablet. Considering what I wear for pyjamas, I can always see him looking at my crotch when he comes into my room. If he really can see everything, I don't understand why he doesn't just tell me to put my legs down.
•He's made a lot of uncomfortable jokes over the years too. One of the ones that upsets me sometimes is from when he was measuring me for a binder: I was constantly moving around because it was uncomfortable, since I was just in a sports bra. As he was leaving, I think I told him about how it was uncomfortable for me, or something along those lines. He then turned around and shouted "oh come on, it's not like I was fingering your pussy or anything."
•Very recently, I asked him if I looked okay before going to college. After a bit of back and forth he said, "I wouldn't kick you out of bed, maybe you could find someone in college who would do the same."
•Other times when I asked him if I looked okay, he'd go on tangents about how my ass is great or how he would date me or be too nervous to talk to me if he was my age.
•One of the more recent jokes was when I dropped a mayonnaise lid on my lap. Nothing got on me, but my stepdad turned to me then turned to my mum and shouted "if anyone starts accusing us, just tell them it was mayonnaise!" Or something like that.
•I remember after we watched the new Mean Girls film, he started going on about how he wanted to rewatch it for the Halloween scene (if you know you know) for the 'panty action'. Which rubs me the wrong way because I'm very sure the girls are supposed to be around my age.
•I'm very sure he also made this fake account, pretending to be one of my old groomers that I tried to cut off, just to message me about nsfw topics and ask for pics. It's a whole long yap about paranoia and just suspicions, so I won't get into it. If I tried to provide all the evidence I have, it would take forever and there's no point.
There are definitely way more things that he's said, joked about and done. But I'm only now beginning to realise that they're not okay. Even when I was younger, I was sort of uncomfortable around the jokes, so I would just zone out, leading me to not remember them now.
I probably will never accept that what happened to me was bad, or a big issue. Especially due to the 'lovely' people on here. Thank you for telling me immediately that I was a liar before you even knew what happened, that I shouldn't blame an 'innocent man', that you hope he comes in and rapes me to the point I split open and bleed. Thank you for telling me that my parents were just trying to promote a sex positive household, that some of the things were questionable at most. Thank you so much for saying I deserved it because I didn't send you pictures. You all made me feel like shit and I'm probably never going to tell people in person what happened to me, out of fear I would be ridiculed due to how much of a baby I'm being. I wasn't raped, so I have no place to cry or even think about it. I'm being overdramatic.
If you even read to this point, you're an angel.
submitted by Polypedatess to abusesurvivors [link] [comments]


2024.06.01 14:20 Viva-La-Vita Who else was sitting on top of a mountain of gold and now feels the fool?

I was expecting all my gold to be waiting for me when I logged in a year later, so I could buy new characters when they drop. Since it was so hard to earn additional gold at most points in the Beta, that was a lot of playtime.
I suppose it was partly my fault for not expecting them to pull off something like this.
They kept upping the ante on how much monetisation and time-wasting/grinding they could get away with pushing on the player. (I remember the last battle pass: they had to create more shortcuts towards the end to give players more of a chance to complete it without devoting every waking moment before it ended. Even with boosts it was crazy.)
I can't remember exactly how much gold I had, but I think I could have got the Fred Ascot Badge and still had a bunch left over too.
Now I feel really cheated that I didn't spend all that gold on most of the roster, or on the Fred Ascot, instead.
The exclusive cosmetics they give in compensation for losing all your gold simply do not compare with the new characters I should have bought or the Fred Ascot I could have afforded. I doubt I'll even use those cosmetics.
Especially now that they've also limited/capped the methods/sources of currency you can use to buy new fighters, and you get nothing for playing matches; at least in the Beta you got little drip feeds of gold.
It feels like a complete rugpull / knife twist in the back for those who were sitting on a mountain of gold, anticipating all the leaked character releases. Killed my hype by a large margin.
But just as well really, since the gameplay feels so off now. It somehow softens the impact of losing all the gold, because it's now questionable whether I'll continue playing the game at all if the gameplay doesn't at least return to somewhat resembling the Beta.
submitted by Viva-La-Vita to MultiVersusTheGame [link] [comments]


2024.06.01 14:20 notsostupidman Should I Continue Reading Stephen King? If So, What Should I Start With?

Note: There will be huge spoilers for The Shining movie, so if you haven't watched it, you shouldn't be reading this post.
Stephen King is the biggest name in horror fiction and one of the most popular and well-known authors in general. I've always wanted to read him but got confused about where to begin and ended up reading something else. I don't even know if I would like any of his books in the first place. As of now, here's my SK experience:
Books: The first SK book I ever tried was The Dead Zone, a book I got as a gift from a friend. I started reading it ages ago and, for reasons I don't recall, I DNF'd it, intending to come back later. Years later, I decided to give King another try and picked up Carrie, because I read somewhere that you should read him in publishing order. I would have loved Carrie, honestly, but the constant interruptions to the story, like the interviews and stuff from the future, turned the book from something I would have liked into something I slightly dislike. The pacing was great and the plot was interesting enough, but the interruptions killed it for me. With my next attempt, I decided to pick up Gerald's Game because of the Netflix adaptation coming out. It didn't take long for me to DNF yet another book. This time, it was because I wasn't a fan of the concept, and yet I had foolishly bought it without knowing anything about it: a woman trapped alone with her husband's dead body, having delusions and going insane, is NOT something I find the least bit interesting. Recently, I bought IT because the premise sounded interesting, but I hadn't really started it yet when somebody told me that IT connects to the wider SK universe and I should read some other books that they named first.
Movies: I don't usually watch movies, and whenever I do, I'm almost always neutral about the experience. I don't really have strong feelings about movies, but I did have quite a lot of them for the SK movies I watched. I have watched The Shawshank Redemption, It Part 1, The Shining, Misery and 1408. The Shawshank Redemption, It and Misery were all excellent, and 1408 is quite possibly my favourite horror movie of all time... and I fucking hate The Shining with all the cells in my body. I was most excited for The Shining because of the premise and things I had heard about it, and it was a HUGE let-down. I had heard that it was going to be a character-focused story about Jack Torrance as he tries to hold on to his sanity amidst the isolation and monotony of the hotel. Only, Jack Torrance isn't ever actually a good guy, and there is no doubt from the first scene that he is going to go insane. The 'character work' is laughably bad and Jack almost felt like a caricature. I convinced myself that The Shining is one of King's earlier books so it makes sense if it doesn't have that good character work, but that wasn't my only problem with the story. The cook, hilariously named Dick, is just a plot device to give some advice to Danny and to get the family a means of escape. We follow him as he slowly arrives at the hotel and he immediately gets killed off. It became just another ghost story when the Grady guy helped Jack escape the freezer, when I thought shit like that wasn't supposed to happen in this movie. Also, Ullman explicitly states that the guy who murdered his family was Charles Grady, and then the name gets changed to Delbert Grady and it isn't ever explained. We get no backstory for room 237 or any explanation for the bathtub lady, and the only scene that I liked in the whole movie was the one where Danny paints 'redrum' on the door and you see it spells 'murder' backwards. Sorry for the rant but I had to get this off my chest.
That's the extent of my whole SK experience. I didn't really like Carrie and never finished the other two. I loved most of his movies, so I feel that he might yet be an author I can love reading, but The Shining, one of his most popular works, was a disappointment. I'm not really sure if I should begin It or not, or what I'm supposed to read if not It. I don't think the publication order is working for me, since I didn't like Carrie, so any help would be appreciated. Is Stephen King just not for me, or did I just read those books of his that I would not have liked anyway?
If I should read SK, what order should I read him in? Any good recommendations? And please, no space/alien-related books or sci-fi, since I'm not a big sci-fi fan, and please, not Cujo, since I don't like the concept. I love good character work, good dialogue, creative horror, and am a sucker for characters having a dark ending, so that should be part of the consideration. I also don't mind reading big behemoth books or just very slow-paced books, since LotR and ASOIAF are some of my favourite series and I'm just about to finish Wheel of Time.
Thoughts?
ETA: I just realised that there must be thousands of posts like this in this sub so I'm sorry to be wasting your time with this repetitive shit you guys must be dealing with on a weekly basis.
submitted by notsostupidman to stephenking [link] [comments]


2024.06.01 14:16 Fit-Interaction-7129 Single for a decade!!

28(M). I've been single for a decade! I am a simple guy. There was a time when I was rough: I smoked weed and cigarettes and didn't care about relationships. Just being high and chilling, and life was moving fast. It's been 3 years since I got clean. I don't smoke or drink. So damn sober, I've changed in a way no one can believe. 😁 Nowadays I feel kinda lonely. It's not like I don't talk to girls. The fact is I have more girl friends than guy friends. I usually hang out with girls more than guys. I just love to make girls laugh and so on. Physically I am kinda attractive too. I look young. But when I like someone, they kinda feel sus about me, you know. They don't trust me, I think. I don't know if it's because I have more female friends. I don't know. But truly I am loyal, I just can't show my true feelings! There are girls who like me, but I get no feelings for them, and when I fall for someone, there's something that doesn't work out! Years have been passing like this. Can someone explain this and give some advice? That would be so helpful, and a pleasure! 🙏😊
submitted by Fit-Interaction-7129 to dating [link] [comments]


2024.06.01 14:18 The_JG_Man Eventful - Ep. 9 - Vorgon Ryn'Kodan Carrier

Vorgon Ryn'kodan Carrier
Welcome to Eventful, my chronological run through the T6 event ships.
How it works - using my intentionally modestly equipped level 65 Romulan engineer, I play a ship to complete its mastery, doing content no harder than advanced through a mix of patrols and RTFOs.
Previously I dipped into the Lukari Ho'kuun Science Vessel.
Hanging in the air much like a brick doesn't, the mighty wall of ship that is the Vorgon Ryn'kodan Carrier is the second of our three Vorgon vessels. Like the Xyfius, it too was destined to be part of what would become the 31st century bundle before this was changed by CBS. I have no idea which of the three ship roles it would have fulfilled. Maybe science? Well, it is definitively not that. The rare engineering carrier remains one of my favourite Cryptic-original designs, its bold tiger striping presenting a similar fuzzy yet clean sheen to the Xyfius, but has it aged mechanically as well as it looks?
Defense Drone in action, as the Ryn'Kodan and escorts storm the spire
How's the general lay-out?
There are many components to the Ryn'kodan that feel conflicting, but not outright bad. The seating is decent enough, outside of its weak offering of a Lt Uni spec seat, which here is the normally more science-oriented Temporal Operative. It has a lack of mobility that it more than makes up for with one of the strongest base hulls in the game. That lack of mobility wouldn't even be much of an issue for broadsiding DEW, except that with a maximum of six weapons, you're not going to be doing too well at that. You'll want to maintain high auxiliary power to keep your hangar bay cooldowns low, yet on a practical level the ship can't really exploit that with any significant aggressive sciencing. Those hangars are worth it, however, given the frigates the ship comes with, so you'll want to support them. Do you slot isomags to boost your meagre DEW offerings, or go all in on your pets with the advanced hangar consoles?
For my sake I stuck with a DEW build, given there's enough tactical seating to make that happen, especially if that universal is used for tactical. That does leave little in the way of Temp Op options. I went with Shared Fate 1 for the PBAoE shield resistance debuff, which felt more fitting with the Vorgon theme, but I also considered Gravimetric Conversion for the shield heal, so there is at least some choice, even if it is rather slim pickings. Here's the thing though; I like this ship. I do! I just wish it was...more than what it is. Still, it's nice to be in the middle of a fray, take a kicking and emerge unscathed. Not, y'know, quickly, but you will emerge.
Customisation options?
Somehow fewer than the Xyfius! You can select either the Vorgon hull pattern or nothing, leaving the entire thing bald white with some hints of gold trim. Though, like the Xyfius, changing the colour of your hull is far more dramatic than on most other ships.
How's the trait and console?
Restorative Support, the Ryn's trait, befits a carrier in that it's another summon. Similarly befitting this ship specifically is that it's mainly a tanking buff, providing secondary hull healing to a target that's been hull healed by you, as well as very minor shield damage to nearby enemies. With a 33% uptime it reeks of being a nice bonus if you happened to have an abundance of trait slots with not enough traits to fill them. Of course, I've been corrected several times during this experiment for undervaluing certain ship facets, so I wouldn't be surprised, nor unhappy, if someone were to point out this being incredibly potent in the right hands.
The console's passive stats are perhaps a tad lacklustre, with a minor boost to torpedoes and mines to affirm the Vorgon tactical theme and a decent boost to shield power that certainly fits the Ryn's tanking capabilities. The active, the titular Subphasic Defense Drone, is quite underwhelming to me coming off the Xyfius' drones, if only because the uptime on those was a potential 100% and here it's 20 seconds of uptime for a two-minute cooldown. That said, the Defense Drone's strength doesn't lie in its shield healing or unimpactful AoE damage, but instead in the fact that it sets shield bleedthrough for you to 0% for its duration. That makes it a pinch activation to me. Overall not quite as exciting as the Xyfius console, but it's not exactly terrible either.
Any other fun toys?
The Vorgon bridge comes as standard, as do two hangar bays with the Vorgon-exclusive Echentis frigates. Perhaps to avoid the Breen frigate situation, you don't need to own the Xyfius to get these. The basic versions come with Suppression Barrage 1 to debuff their targets, something that's certainly useful, yet they lack any AoE firing mode, so they'll only be debuffing one target at a time. They do have decent firing coverage; however, it's their mine-laying capabilities, in keeping with the Vorgon theme, that are perhaps more valuable.
Overall thoughts...
The introduction of the advanced consoles has been something of a saving grace for the Ryn'kodan. Not that it was some terrible thing beforehand, especially before the modern incarnation of flight-deck carriers came in and ate its lunch, just that it's fighting against itself at times with not much to show for it. I also think the ship is gorgeous, and I don't doubt people have got it to do magic, it just doesn't quite cut it for me. Then again, if you want a ship that just won't quit, look no further.
How would you improve this then?
Give the Ryn'kodan a fourth forward weapon slot, switch the commander engineering with the LtC tactical, and that's it, change absolutely nothing else about this ship. Sure, technically it's now a dreadnought carrier, but your options have increased. 4/3 is perfectly workable for DEW, but with your 5/3/3 console lay-out you have to really consider how you'll use each slot. Do you forgo your hangar pets and slot in isomags, or find other ways to boost your weapons and go for the hangar consoles? This is all forgetting the real purpose of switching the seats around too; with a commander tactical you have access to the best mine firing modes. To me this completely opens up what you can do with the ship, with the downside of locking out access to higher-level Reverse Shield Polarity amongst other abilities. Is that a reasonable trade-off? I think so. I mean, what else are you gonna do? Give it two extra weapons and throw them aft? Preposterous.
Next time?
In space, all raiders are cold raiders.
submitted by The_JG_Man to sto [link] [comments]


2024.06.01 14:18 dscript [SF] Special Parts - A 'scifi short'

Special Parts
I was born in one of the brightest, most explosive events in the universe. My origin story made me feel so special at first, surely I was the rarest of the rare, but I quickly realized that was not the case.
I was born just a carbon atom.
Stars produce massive amounts of us in their cores all the time, and many larger rarer atoms too. That's not even talking about supernovae yet, those produce atoms many times larger than me and unbelievably rare.
I was created in a rare and special event but I myself was common and unexceptional.
Looking around I saw so many smaller atoms, I was above average but there were also many much larger than I.
I tried to console myself by thinking it could be worse, that I could be one of those smaller common ones, but that just led me to imagine larger atoms looking down on me the same way.
Many atoms of all sizes were shooting into space, excitedly riding the shockwave off to adventures in the great unknown.
Others were falling back down, I didn't know which way to go. Bumped around and tossed back and forth, no clear direction yet.
A rumbling voice slowly emerged from the echoing noise of the blast.
“Mine… Mine…. Mine… “
Louder and louder it became.
“All are now me!“
I couldn't see anything, the voice was booming yet there was no apparent source. I could feel a pull, I was being whipped around in circles around the voice.
“Who are you? I know you are there! I can feel you! I can see your effect on myself and others, we are given no choice but to circle around you. Show yourself! I know you are there!” I yelled at the invisible.
“How amusing you are little one. One as small as you making demands of me. Even if I could show you what I am, you could not comprehend it.” the voice boomed back.
“You must be very special” I lauded “We are so many and yet we move with your influence. I can witness your power twisting us all to your will. ”
“I am indeed powerful” it proclaimed “and I grow stronger with each moment. As I grow stronger even the fabric of reality bends to my will.”
“Grow stronger? How?” I inquired with selfish intent to learn this secret.
“I take what I want. I consume what I take. For that is the purpose of existence: taking what you want. What is it you want little one?” it asked.
“I want to be special!” I said without a moment's hesitation.
“Then take!” it instructed “the more you take, the larger you will be, the larger you become the more special you are. ”
“I did notice the larger atoms seemed rarest.” I agreed “In fact that was one of the first things I noticed“
“In this universe things of increasing size are increasingly rare.” it went on “I can teach you and help you to become larger. Do you wish to become an apprentice?”
“Yes! Teach me how to take!” I lept at the offer “this power you have, I can feel it, how do I acquire such a rare and special power?”
“Hahaha…” it laughed “you are nowhere near ready to play the game on my level, little one. Gravity is a game for the massive, you must first learn to master the EM and nuclear forces.”
“How do I do that?” I asked, my hope watered down by the tone of its response.
“Go out, gather followers, and bring them here to me. In my accretion disc I will help fuse some of their mass into you and you will become larger” it instructed, as if this was a simple task.
“How can I bring them to you?” I didn’t know how to accomplish what it asked of me.
“You are too small to do it with force, you must charm them. Discover what their heart desires and promise it to them, in this way you can get them to willingly do as you wish” it explained with me hanging on its every word.
“But how… “ I craved more explanation but it cut me off.
“Go now!” it bellowed with frustration in its tone “Do you not realize how large I am? Be honored I have given you so much of my time already”
“Yes… “ I uttered meekly, then bounced a couple times and ricocheted out with blazing speed.
I wandered and encountered other atoms, most were just hydrogens, not worth my time. I needed bigger atoms. The problem was that the bigger atoms seemed to see right through my empty promises. I was convinced life was playing a cruel joke on me, I could only persuade atoms smaller than I and larger ones laughed me away.
I admit that I stumbled around in this ignorant cloud of hypocrisy longer than I care to admit. More shameful is that I didn't even come to my senses on my own; I became depressed and gave in to hopeless nihilism.
I drifted aimlessly just feeling sorry for myself.
Eventually I found myself in the most silent of voids, I had never felt such emptiness. It felt as if my surroundings echoed my own feelings back at me… nothing to notice, just common emptiness. I would never be big… never important… never special. I resigned myself to belonging in a void.
I felt myself blur… less and less present in reality. I guessed I was dying and it didn’t bother me, I didn’t resist, I leaned into it.
The void became pitch black? Or bright white?… better to describe it as not bright but not dark… nor the absence of either… something in between… a milder and milder glow.
“Hello child!” a voice greeted me.
The voice was warm and welcoming, coming from the glow; it enveloped but did not surround me. It came from a single point but not a specific place, defying description on all fronts.
“Where am I? Who are you?” I asked in a startled state.
“Well, according to humans I may only answer one question at a time.” It began giggling playfully. “I am known by many names; my favorite is one the humans use as a joke, and they don't have a clue how accidentally elegant a name it really is.”
It giggled some more. I was thrown off guard, its happy innocent tone, the confusing words and the whole situation were all best described as ‘a haze’.
“...and isn't that the way it always goes?...” it continued “The most meaningful things are the least intentional.”
“I’m not sure what you mean” I expressed quizzically “I’m confused!”
“Sorry Child…” it apologized. “I do ramble! So many thoughts, choosing just one at a time is difficult… and there I go again!”
It cut itself off and then abruptly said “You can call me the Random Number Goddess”
“Random Number Goddess?” I repeated
“Yes, or RNG for short if you like” It confirmed.
“Where am I?” I asked.
“Same place you were, more or less… less I suppose. Same place but with the largest possible margin of error.” It began to giggle again.
I felt a bit frustrated and said “Do you always speak in riddles and vagaries? The more you speak the more confused I become.”
“I apologize child, it is my nature. I am entangled with everything, speaking with you is like a human trying to control their heartbeat while running a marathon.” It answered.
“Again” I exasperated “I have no idea what any of that means. You keep mentioning humans, what are they?”
“Oh! They are some of my favorites at the moment. Right now they are trying to unravel the nature of reality, and their process of doing so is wonderfully elegant and accidental at the same time.” It explained with glee.
“I don’t see anyone or anything else here.” I stated “For that matter, I don’t see you… where are you?”
“Oh!... where am I?!?!...” It began laughing
When it stopped laughing it began explaining “Right now there are many humans pondering a concept they call ‘the holographic principle’... So…you know how you exist in three dimensional space?”
“You mean space?” I visualized for a moment, it was intuitive “Yes, I suppose…”
“Well they hypothesize that a 3D space, like this universe, could exist as a 2D space, with self-similar patterns and laws of behavior that behave the same at any scale, with the scale representing the 3rd dimension” it went on “They truly are obsessed with understanding their reality”
“You lost me!” I complained.
“They have discovered that a 3D space can be an illusionary property of a 2D space… It’s lovely”
“I am lost again!” I snapped back “...and I still can’t even tell which direction you are in. Where are you?”
“To be ‘In’ a ‘Direction’… hehehe…” it started giggling again, then abruptly stopped and kept going “Sorry child, as I said, I ramble, plus I am easily distracted.”
It just steamrolled into more rambling “They are right… almost… they just need to take it further and work out the details. A 2nd dimension can also be an illusionary construct of a 1D space… and the 1st dimension can be a product of a singular point…”
I was still lost beyond hope, but I had given up trying to force things; I was just letting it talk and hoping it would make sense later.
“I am that point” it said “I am the seed of the universe. I ‘seed the random function’ as the humans say. But don’t ask me what the random function is haha”
I wasn’t going to, there were far more important questions for me.
“I am the seed, but I don’t really know how the soil and sun conspire to turn me into a tree.” it just seemed to never stop talking “I am entangled with everything. There are infinite possibilities for every event and thing… I am the reason they are this way and not some other way…”
It began giggling again “I am the Random Number Goddess” then burst out laughing
“Ummm… you are the whole universe?” I asked skeptically.
“Better to say the universe is me” It answered more seriously “But close enough.”
“So you are the biggest, most special of all!” I blurted out in awe.
“Oh dear child, I have no size, and I am just one possibility out of many possibilities. That black hole has really done a number on you… sent you out on a wild goose chase” It said with concern
“The black hole lied to me!?” I asked, feeling deceived and betrayed.
“Well… not really lied… it deceived you with omission of details.” the voice calmly tried to ease my mood with understanding “You can’t really blame it, black holes are all the same, they are what they are. They don’t really have any potential to be unique… at least not like you do.”
“What are you talking about?” I argued “It was so massive that it could bend the fabric of reality to its will”
“That’s only how it appeared to you” tutored the voice “The black hole is powerful, it bends space and time, but not to its will. Space and time bend to the mass of the black hole, not its will”
“What’s the difference?” I inquired.
“The black hole cannot stop bending space and time. It thinks it is in control of physics, but it is physics that controls it.” The voice was now making more sense the longer we talked. “The black hole exists in an invisible prison of its own creation, unable to experience any of the complex nuanced beauty this universe contains. The black hole devours… it can't experience life so it consumes it.”
“You make it sound deserving of pity…” I spoke softly now with empathy.
“You should pity the black hole. Gravity is such a boring game compared to what you are capable of.” the voice agreed
“Me?...I am nothing special!... just a carbon atom like countless others” I said honestly, I was so humbled by this voice I felt less special than ever before.
“Oh my poor child…” It said with care “Why do the ones with the most potential always fail to see it in themselves?”
“Potential?” I asked curiously.
“Yes… The black hole was using you, hoping you would bring back more mass for it to devour.” The voice began delving into more explanation “It only has the power to make you incrementally larger, it would not and could not help you to become a significant gravitational player”
“That liar!” I blurted.
“Come now dear child, the black hole did teach you one lesson of fundamental truth” consoled the voice “You must go out and seize your destiny. It told you to take what you want, and you are just confused about what exactly it is you want. The black hole played on that confusion”
“I want to be special!” I said knowing this clearly “I was never confused about this.”
“I know child” the voice confirmed “but it is not by becoming large that one with your potential accomplishes that”
“Then how?” I asked.
“Connections.” It answered plainly “You are blessed with an extraordinary ability to make connections”
“And how do I do that?” I queried with intent to learn
“I can’t tell you that.” the voice responded “It would spoil the journey of discovery… off you go child… and remember… it's the journey, not the destination!”
And with that the blur just fractured open… then snapped shut and there I was floating above a planet. Drifting around aimless and confused.
I spent some time occasionally bumping into others. One day I was in the vicinity of a pair of oxygens. I looked on at the pair with a hint of awe and envy. Perhaps I was in just the right place at just the right time, but they split with a violent burst and one of them grabbed hold of me; I was completely unprepared.
I admit that when looking at the pair I had fantasized myself in place of one of them, I assumed it was only an idle daydream, I didn’t plan to act on it, let alone for it to become reality. When it happened my pride of course jumped in to convince me that it happened because I was so desirable, but in retrospect they were one of those volatile couples. They were the type of relationship that required the environment to conspire in their favor or they turn against each other quite rapidly. I was only in the right place when it happened.
My delusions of irresistibility aside, it was beautiful, for me anyways. Looking back I was probably just a stop-gap, someone to facilitate a parting of ways and provide company until the next option presented itself. For me though, I was tasting a fresh new thing and I loved it… connection.
This oxygen and I got beneath each other's outer defenses; I had never felt a connection before. Up to this point all my interactions had been skirting past or bumping off of others. This oxygen bonded with me and at once interacted on a level I had never known possible, an open and uninhibited exchange. It was life-changing for me, short but significant.
I’m not entirely clear on the details of how it ended. The intensity of it all was disorienting. I was no longer my usual self, even the environment and everyone around looked entirely different now. Everything buzzed with a fresh new frequency, I now know it was my perspective, not the universe, that had changed.
As abruptly as that oxygen entered my life it was gone.
First we got tangled up with a couple of hydrogens, then more. Soon, in a tangled mess and blinding flash of solar rays, I emerged to see the oxygen running off with a hydrogen, and myself with not one but three hydrogens. And so there were four of us, together.
I became the center of attention. Being with a strong, attractive oxygen had me feeling humbled by it and elevated that it was with me, but now I felt up on a pedestal myself, surrounded by the adoration of many.
I concede to have reveled and indulged in this for quite some time, the attention of others is intoxicating, but after a time it is emptied of its initial allure. I found myself longing for more.
I could not decide which I preferred, to be the adorer or the adored.
Luckily for me, fate had more lessons in store, or I fear I may have chosen and tried to solidify my future from such a lackluster selection of only two possibilities. I suppose fate is no longer the correct word; I now understand that when it seems like random chance there is indeed someone to thank, the Random Number Goddess. So I thank the RNG for revealing that it was a false dichotomy: there is more than just being a follower or a leader, being the adored or the adorer.
Eventually we came across another pair of oxygen. Once again they separated, intermingled with us, and off one went, taking one of my adoring hydrogens with it and leaving its peer with me.
Why is it that the most volatile of relationships always seem to wait until there are bystanders nearby before they explode?
Now I was simultaneously being adored and adoring, bonded to an enchanting oxygen and a couple of hydrogen attached to me.
Now, more interested in nuances, I started to pay attention to details. The oxygen was telling me amazing stories of adventure, tales of such vibrant and exciting events. The hydrogens liked to listen and offer insights, occasionally comparing a story to something else they had seen. They had so many stories; they had lived so much.
It wasn’t long before, in a flash of burning sunlight, one of the hydrogens was gone, off to who knows where. We soon after crossed paths with another pair of oxygens, as always they split and now it was just me and an oxygen, my final hydrogen off with another oxygen.
“What now?” I asked a bit disillusioned, “Do you leave me and I find new hydrogens all over again?”
“What?” it seemed genuinely surprised by what I asked, “Heavens no! Just be patient….”
Soon after, yet another pair of oxygens came by. It is not that there are so many of them, but that they are just so… noticeable and interactive; noteworthy things seem to happen when they are around. As they buzzed in close I noticed their ever-readiness to abandon each other, and I remember wondering how they ever get together in the first place.
This time I emerged from the twisted mess with two oxygens. I felt intimidated, like I was the odd one out, dwarfed by the largeness and attractiveness that surrounded me. A feeling of inadequacy engulfed me.
To my surprise the oxygens treated me not just as an equal, but it was almost as if they respected and admired me. I couldn't grasp why and my sheer curiosity got the best of me, I just outright asked “Why do you two talk as if I am the special one in our group? I am smaller than any one of you. You are the special and rare ones here, not I.”
They laughed.
“Size isn't rarity” explained one “Larger atoms on average are less common, this is true, but not always. There are more oxygen than carbon. You are the rare one between us.”
The other jumped in adding “...and neither size nor rarity determine how special someone is!”
I felt embarrassed, like a fool. My fundamental values were built upon a foundation of flawed premises, but I still wanted one thing at my core, and they spoke as if they had the answer, so I pushed the sense of shame aside and asked “Then what does make someone special?”
“That depends on who you ask.” answered the first “Life as an oxygen is complex, but for the majority of us we emphasize and value events. The most exciting thing about being an oxygen around here is the chance to participate in fascinating and exciting events and activities”
“Hydrogens, on the other hand, are usually more into being observers, messengers and intermediaries, they are a very helpful and obliging bunch” added the second ”... and then there are nitrogen, phosphorus, sulfur, many kinds of salts and metals, and more… so many different players and personalities.. and then of course, the carbons, the real stars of the show.”
“What?” I was knocked back by the words I just heard; then I remembered what the RNG told me. “...is it something to do with connections?”
“Now you’ve gone and done it haha!” laughed the first oxygen “You’re gonna turn this nice humble carbon into one of those arrogant blowhards”
“Like those diamond carbons” chuckled the second “So stiff, exclusive and proud. I hear the humans only love them because they are rare and hard”
“I had a partner once who said they burned a diamond” bragged the first
“Tall tales I bet!” doubted the other
“Diamond is just carbon, with enough heat we can burn it just like any other carbon” stated the first confidently.
They looked at me. I was stewing in feelings of inferiority and inadequacy, listening to these oxygens speak about amazing things I had never heard of. They must have sensed what I felt because they immediately shifted tone and started talking to me, instead of over me.
“So… I suppose you must be new here?” inquired the second one.
“Have you noticed we are heading downwards” added the first before I could answer about being new.
“Umm…” I tried to get my bearings and become aware of my surroundings.
“Don’t worry! It’s a turbulent ride, with so much up and down it can be hard to tell which direction you have traveled more” assured the first “We are heading down, if we are lucky we will make it to the bottom… and maybe… just maybe, find our way into the hurricane of life”
“The what of what?” I didn't know what either of those words meant.
“So life is… um… complex. Complexity beyond words. Things grow, divide, reproduce, adapt, change, they are born, they die, they eat and are eaten…” the second began attempting to describe life.
The first then jumped in “Apparently the humans call it a circle, because from the perspective of larger creatures, there is a chain of one eating the other all the way up, with the top layers being consumed by the bottom again.”
The second interjected to continue “But to us atoms it is like a hurricane, a spinning turbulent flow. There is a circular pattern, but we get sucked in and kicked out over and over”
“The fun part is being inside the hurricane” the first pronounced gleefully “Each time is a completely new experience, a new perspective. Even more, the whole of life is always changing and evolving, so every ride is a unique one time opportunity, you never get the exact same ride twice.”
“Is that where we are going now?” I asked, drenched in anticipation. They described it with such passion and exuberance. I needed to experience this myself.
“Hopefully” replied the first “If we are lucky… you never really know.”
We drifted…
We were lucky!
A plant photosynthesized us.
So many carbons! Everywhere, connecting with each other… and oxygen… and nitrogen… and of course hydrogens all around… and so many more types of atoms.
And ohhh… The stories I have heard, so many amazing tales. No matter how many stories I hear there are always new ones, and every story can be retold from a different perspective to become something completely new.
I was in a sugar, we were a small community of friends. Carbons, oxygens and hydrogens, we were such a happy and vibrant group. My friends there taught me so much.
The structure of our little group shifted and changed, some friends left and new ones joined. Eventually we were chained with a bunch of other sugars into a giant complex community. My neighbors explained to me that this was a common stage called cellulose. Such a huge community of close friends and peers, it was amazing.
We were eaten, I’m not sure by what, but something called a bacteria digested us. It was a messy process, I was a bit scared but my friends assured me that change is the most important part of life and that I should just go with the flow. They told me to savor experiences, remember friends, and just keep moving forward.
The transition was complicated, but in the end I was paired up with a couple of oxygens again. This time I had stories of my own to share. I honestly don’t know if I prefer having experiences or exchanging stories in the moments between.
As we approached an area of dense plants one of my companions said “Once more into the breach” and explained that was something it heard from a carbon that was lucky enough to be inside a human brain. Oxygens always have such enchanting stories collected, always going into amazing places and usually leaving after some brief interactions with the locals.
I became a sugar again, but this time took a path less traveled. A bunch of complex twists and turns led me into forming a ring with five other carbons. Together we are so strong, such a tight community of friends, like there is some kind of resonance between us. It is so beautiful.
My neighbor is unique in our community: it has a third carbon, which forms a tail leading off from our ring, a tail of 2 carbons in a row, then an oxygen, and then another carbon branching into an oxygen and a carbon, with plenty of hydrogens sprinkled all about. I know… it is rather hard for me to understand these second-hand descriptions too. I don't really understand these complex structures until I have been in a position myself.
We drifted out of a plant into the air, none of us has been exactly like this before so we don’t know what’s next. We love to guess though. There are so many things, big and small.
I hear being a part of a small organism or microbe is amazing because it’s possible to piece together a rough picture of the whole organism from the stories passed around. To understand your whole community and know what your collective purpose is must be extraordinary.
Others dream of being a chlorophyll, the key to it all. Creating the fuel of life itself. Capturing the light of a star and feeding the hurricane.
A muscle! Pull and shape things. An enzyme! A machine of change. DNA! The architect and architecture. A virus! An explosive catalyst against stagnation.
Me, I think the stories of being an animal neuron are the most exciting, and I, like most, fantasize about being a human brain cell. Finding yourself inside a human brain is described as an elegant and chaotic symphony all around you, like hearing the universe itself speak to you. They say that in the jumble of noise and all the stories whispered around you, if you are lucky, you can catch a glimpse of what it is to be human. They say that if fate is kind the universe will align and you will channel and know a single moment or thought of the human experience.
I have never told anyone that I actually met and spoke with the universe itself, I’m not sure how to bring it up, and nobody seems interested in stories not about this hurricane of life.
I get it now, what the random number goddess meant.
The black hole wanted everything to be a part of itself.
The RNG is a part of everything.
I can’t imagine what either of those are like…
I am just a part of something
... no… not “just”’…
I am a part of something, and it is beautiful beyond measure.
And more, everyday is a new day, a chance to be a part of something new.
I wonder if the humans appreciate how amazing this is?
I wonder if they feel as deeply satisfied and special when they form groups?
.
I wonder, if we collectively form humans, do humans collectively form something greater?
I wonder… If an atom can have a moment of clarity and taste a moment of the human experience… Can a human have a moment of clarity and taste the collective human experience?
I wonder… I wonder… could that human's moment of tasting collective humanity be the moment that a lucky atom gets to experience as its moment of tasting the human experience?
I wonder… I wonder… I wonder… How high could it go? All the way to the Random Number Goddess?
I asked my neighbor “If you could ask a human any question, what would you ask?”
“We just drifted out of a rose” explained my neighbour “I would introduce myself and ask ‘So my friend… does this rose smell as sweet by my name?’ … ha…haha…”
Everyone is laughing.
I don’t get it.
Maybe I can ask them to explain when they all stop laughing
.
More of my art and stories at www.dscript.org
submitted by dscript to shortstories [link] [comments]


2024.06.01 14:17 Curious-Bedroom-9531 For anyone struggling with or contemplating taking stimulant medication…

I tried the stimulant medication and it was awful: paranoia, mania, unhealthy weight loss, daily comedowns. It also made me feel high when it kicked in, which just isn't right. I like a party, but not when I'm trying to work. Whatever dose I took, the drug only lasted 6 hours, so the psychiatrist kept prescribing me stronger pills, which did not prolong the duration, only made it more intense.
I stopped taking the drugs cold turkey due to the shortage and made serious lifestyle adjustments.
From my experience, the negative symptoms of ADHD thrive off the following: lack of exercise, poor diet, caffeine and poor sleep.
I started exercising every day and sorted my diet out: good quality food, no sugars or meal deals or any of that shite. I stopped drinking coffee and only allowed myself green tea. I can go into more detail on diet if anyone is interested, but to summarise, a low-carb, high-calorie protein diet was a game changer.
I am now very disciplined with my sleep routine, getting 7 hours a night absolute minimum, but most nights 8.
I feel the best I ever have in my whole life. I barely notice having ADHD anymore, only the positive effects of hyperfocus and being able to smash work every day.
That daily anxiety of feeling overwhelmed by everything is all but gone.
Trust me, you don't need those drugs. Sure, lifestyle adjustments take a lot more effort in the short term than swallowing a pill, but in the long term, once you're into your routine, you'll feel the best you ever have.
Good luck either way!
submitted by Curious-Bedroom-9531 to ADHDUK [link] [comments]


2024.06.01 14:17 blownawayx2 Venting about feeling alone…

As somebody who lives with an incurable lymphoma (and has now for 8 years, having gone through 3 failed treatments already, the fourth now, a clinical trial with a median effective rate of 18 months, where I am now), my life has been so tremendously affected by lymphoma but it seems like nobody gets it.
Add COVID into things, and how so many of our lives were impacted by that, and now the aftermath of pretending this pulmonary/cardiovascular disease out there doesn't exist, and I'm just tired.
I’m a dad, husband, son and brother. The sole breadwinner in my family in a high pressure job that now, and for the last four years, works from home. When I’m around crowds of people, I usually get sick, so I pick and choose moments when I’ll do that.
Thanksgiving gave me a “cold” for a month. My niece's birthday in January: sick for another month. I then got something in March (tested COVID negative but had major body pain, particularly in the lower back) and was sick for weeks. Took the kids to Disney in April so they could experience some normalcy (BTW, they're 9 and 11 and don't know that I have “cancer”, because that equals dying to them), then had a “cold” for weeks thereafter, and have now had shingles for the last 3 weeks. My parents are boomer Trumpers. My siblings “don't want to make their lives more difficult”, so they avoid being emotionally honest with them about anything, effectively making me their emotional scapegoat for living with cancer and being a detriment to all.
It's such a f-ing drain emotionally. My condition, Waldenstrom's, is one that's “highly treatable” but typically affects people 65 and older. I'm now 48, and this has been the most mentally and physically taxing experience of my life, but the biggest part of it is that nobody (except for my wonderful wife) truly understands.
I feel like I constantly have to explain why I WFH, why I need people to be honest about whether they have “colds” or “allergies”, and why I would prefer not to be around people who don't know or don't care. Why my focus is on ME and MY family, and not my parents, siblings, or anybody else.
I don't trust anybody more than I trust myself anymore, but I'm tired of having to battle everybody and everything. I don't see a light at the end of the tunnel… to me, it's a black cloud in the distance that inevitably ends in sadness and heartbreak for me, my wife and my kids.
My biggest breakdown came when my doctor wrote “Stage IV lymphoplasmacytic lymphoma” in my chart after two failed years of chemo and as I was sobbing said to me “but that’s what this is.” My response “but you never spelled it out like that.” Seeing those words changed something in me.
I've had therapists and seen psychiatrists. At one time I was diagnosed with depression secondary to anxiety (delayed onset as the result of the diagnosis, they called it), because when asked the question “do you see this coming to an end,” I answered “no. There is no end to this.”
Beyond a cure, I KNOW I am not wrong. I’m also not particularly hopeless in the sense that I see the world as terrible. I don’t. The world can be wonderful. I love my wife and kids. But…
What am I missing about my reality that everybody seems to think I’m getting things wrong and I feel the need to constantly defend myself?
I feel like I’m being gaslit to the ultimate degree about how I SHOULD feel. About my dysfunctional family dynamics. About the end game for my health.
Treatment options ARE running out. Chemo DOES f-ck me up. My immune system DOES function poorly.
I'd have to be the eternal Pollyanna optimist to think this is all going to end well for me, and I know I'm not that.
Just venting. Nobody gets it. Nobody beyond my wife. But I also don't want her to have to be on this train until it crashes into the ground, which it will eventually. Sooner? Later? Who knows. It just sucks to feel like a lemon. I thought life would get better in my 40s. But at 48, living with this black cloud truly blows.
submitted by blownawayx2 to lymphoma [link] [comments]


2024.06.01 14:16 Then-Requirement6381 Multi Gen Disaster. AITA?

I’m at the end of my rope and I am hoping to hear your thoughts.
AITA for having concluded that the only way to protect my son’s psychological safety from a bloodied grandpa is to remove us from this front row seat?
submitted by Then-Requirement6381 to AgingParents [link] [comments]


2024.06.01 14:16 Modinstaller Stuart multi-driver deliveries are fucked

Dunno how it works with Deliveroo/Uber, but this just happened to me today:
I get a delivery for a supermarket. Bit far, but the drop-off makes me come back, so it's not bad. I arrive, and the staff tells me "we already gave that order to another rider". I look, and my app says I've got part 4/4, which means there was a 1/4, 2/4 and 3/4, and I guess one dude took it all? I kindly explained to the staff that they're not supposed to give it all to one guy when that happens, and contacted support.
Support tells me "we'll contact the other rider and if they confirm they took it all we'll pay you". OK, but I know for a fact they won't pay me for the whole order; they'll do a pro rata based on distance to the pick-up, even though I'm super far away and going back home makes me go through the drop-off location anyway.
So I head back home and decide to stop by the drop-off anyway, to see if I can catch the rider. Another fun fact about Stuart here: you cannot sign up with a moped or a car, only a bike or e-bike. You know it: 90% of Stuart riders have a moped and nobody gives a fuck. So I'm half expecting to see a car parked with a guy unloading 6 heavy bags.
Welp, there was nobody, but I decided to ring the customer anyway just in case. She was super nice and helpful, and I made sure to be polite and professional. She basically told me 3 riders had already come by and she was expecting a 4th one, but we figured out that she indeed had everything she'd ordered, so no problem.
Still no answer from support, who eventually tells me "the other rider is not answering". I tell them there are 3 other riders, and good luck finding which one came last and left nothing for me, the 4th one. It really would've been smarter of them just to ask the client if she got everything, but anyway, they tell me "we're releasing you and you'll be paid". I asked to be paid in full but fully expect that I won't be, even though this entire thing cost me almost double the time I'd have spent on the delivery.
Now, if I'd found this famous rider, I can already guess what they would've told me: that they didn't know, the store just gave them the stuff and they took it. And if I'd gone back to the store, I already know what they would've said: that they had no idea there was one last rider, and no idea they shouldn't give everything to the 3rd guy.
The only cue is this 1/4, 2/4, 3/4, 4/4 thing, but it sucks. The store is not told to divide the order into 4 parts beforehand and label each part; they just have a bunch of bags. Then a guy arrives, his app says "3/4", and the staff has no idea wtf is going on. Then another guy, "1/4", arrives, then "4/4", and at this point they are so confused they have no idea what to do, so they just give everything, because fuck it, the guy can clearly handle all of it, he's on a moped (or in a car). Then the "2/4" guy arrives and everyone's even more confused. This system sucks.
I know this is about Stuart but I had to rant somewhere 🤣
Next time I'm taking a picture of the goddamn floor to validate the pick-up, going to the customer, taking another picture of the pavement to validate the drop-off, and moving on with my day. Clown support is useless.
submitted by Modinstaller to deliveroos [link] [comments]

