Images that aren't against TOS getting blocked/removed on upload
As the title says. Lately I have been having huge problems uploading perfectly fine images that don't break the TOS. Did something change, or do certain prompts just trigger the detection system? When I uploaded images created in ComfyUI, I never encountered this problem.
One of the images in question has the following generation data:
Prompt: (high quality, best quality, detailed skin, intricate detail, detailed texture, masterpiece:1.2), 1girl, woman, solo, (freezing summer evening:1.1), (modern age:1.2), (fantasy world, feline :1.2), (young adult gravekeeper:1.2), (shoulder long curly hair:1.1), (crazy expression:1.1), (studio lighting:1.3), (muscular body:1.25), (beside unreal police station:1.2), macabre hot dog
Negative prompt: (low quality, worst quality, extra digits:1.3), (watermark, signature, branding, logo:1.1), (kid, child, loli, underage:1.2), (giant breast, huge breast, wide hip, thin waist, thick thighs, obese, big ass, fat:1), ERA09NEGV2, epiCNegative, boring_e621_v4, lr, Concordant-neg, nsfwEM, nude
Parameters: Steps: 30, Seed: 2153832918, Sampler: Euler a, CFG scale: 8, Size: 512x768, Parser: Full parser, Model: realDream_8, Model hash: 2ab7fa3211, VAE: vaeFtMse840000Ema_v100, Clip skip: 2, Backend: Original, App: SD.Next, Version: 39085ab, Operations: txt2img; hires, Second pass: True, Hires force: True, Hires steps: 15, Hires upscaler: RealESRGAN 4x+, Hires upscale: 2, Hires resize: 0x0, Hires size: 1024x1536, Denoising strength: 0.4, Latent sampler: Euler a, Image CFG scale: 6, CFG rescale: 0.7, Sampler brownian: False, ADetailer model: mediapipe_face_mesh, ADetailer confidence: 0.3, ADetailer dilate erode: 4, ADetailer mask blur: 4, ADetailer denoising strength: 0.45, ADetailer inpaint only masked: True, ADetailer inpaint padding: 32, ADetailer use separate steps: True, ADetailer steps: 30, ADetailer version: 23.11.0, Wildcard prompt: "(high quality, best quality, detailed skin, intricate detail, detailed texture, masterpiece:1.2), __gender__, (__temp__ __season__ __timeofday__:1.1), (__ages__:1.2), (__ethn__ :1.2), (__age__ __prof__:1.2), (__hairlength__ __hairstyle__ hair:1.1), (__mood__ expression:1.1), (__light__:1.3), (__bodytype__ body:1.25), (__locpos__ __adj__ __loc__:1.2), __adj__ __obj__, "
This is the resulting image: https://i.imgur.com/pr9UxH4.jpeg
My assumption is that it is being flagged because no age of 18 or older is specified in the prompt. Even when there is no nudity, the system seems to flag anything that could remotely be considered adult content unless an explicit age is given in the prompt.
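If that hypothesis is right, a quick local pre-check before uploading could save some failed attempts. This is only a minimal sketch: the `ADULT_AGE_TAGS` list is an assumption of mine, not the site's actual moderation rules, and the real filter likely analyzes the image itself rather than the prompt.

```python
# Hypothetical pre-upload sanity check: warn when a prompt contains no
# explicit adult-age tag, since (per the guess above) the filter may treat
# ambiguous ages as potentially underage. The tag list below is an
# assumption, not the site's documented behavior.
ADULT_AGE_TAGS = {"adult", "mature", "young adult", "middle-aged", "elderly", "18 years old"}

def has_explicit_adult_age(prompt: str) -> bool:
    """Return True if the prompt mentions at least one adult-age tag."""
    lowered = prompt.lower()
    return any(tag in lowered for tag in ADULT_AGE_TAGS)

# The prompt above does specify "young adult", so by this guess it
# should not be the trigger:
print(has_explicit_adult_age("(young adult gravekeeper:1.2), 1girl, woman, solo"))  # True
```

Interestingly, the generation data above already contains "young adult", which would argue against the missing-age theory, unless the filter ignores the prompt entirely and scores only the rendered image.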
Yes ... this is getting really annoying lately. Images are getting flagged for no reason, even portraits showing only the face. And I don't think it's the prompt, because when I upload a stack of, let's say, 5 images, all with the same prompt and generation data, only 2 get banned and 3 are published. Hopefully this will be fixed soon ...