
ZeroShotClassificationPipeline has large memory spikes when using a lot of candidate_labels #24873

Closed
2 of 4 tasks
rsmith49 opened this issue Jul 18, 2023 · 8 comments · Fixed by #24893

rsmith49 commented Jul 18, 2023

System Info

- `transformers` version: 4.27.4
- Platform: Linux-5.4.0-1103-aws-x86_64-with-glibc2.27
- Python version: 3.8.0
- Huggingface_hub version: 0.16.4
- PyTorch version (GPU?): 1.8.1+cu102 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?: YES
- Using distributed or parallel set-up in script?: NO

Who can help?

@Narsil

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

Repro Steps Cell

from transformers import pipeline

tmp_repro_data = ['What a wonderful creation. Art, in our house, comes in many colors Thanks to Crayola! We unfortunately never seem to have enough crayons, though!',
 'I purchased this to replace my 7 yr old video baby monitor that had been dropped too many times. I love this monitor. It also works with my old Summer monitor camera.',
 'This float is very comfortable (and we LOVE the cup holders) but the headrest is a little too high to be universally comfortable. I think letting some air out of it would solve the problem.',
 'I have Marmite on toast every morning. Although I have been told by an Australian friend that Marmite is not up to the standards of Vegemite, I cannot tell the difference. An English friend tells me Marmite is better. Go figure.:) I love it and make certain to never run out of Marmite. This is one of those',
 "This was the only reason I could get anything done once we got home. My daughter always wanted to be held, but once you would lay her under here (especially the mirror), she was mesmerized. I highly recommend buying the extra set of toys also! Even though she's older now and doesn't play with it as much - she still loves to take all of the toys with her and play with them alone!!!",
 'This is the best packaged coconut water that I have ever tasted. It reminds me of the fresh coconut water right from the tree that I used to have in Jamaica. I have tried other brands of coconut water, but none can compare to Vita Coco. That is the only brand that I will buy...other that buying the real green coconut.',
 "I specifically looked for this product online because I couldn't find it in the local drugstore any longer. This really does what it says it does - it gives you a matte finish which lasts pretty much all day - you will notice a difference when you use it and when you don't. If you tend to get oily skin during the day (T-zone or otherwise) this minimizes that significantly. I hope I can always find this product!",
 "We got this for my daughter's first birthday and she loves it. She can make the animals or just push the frog and dance to the songs. It's also easy to take with you and at home, she can carry the little farm house to another room if she wants. (It's been thrown all over and hasn't broken, which is another plus.) We may get tired of the same animal facts and songs but she never does.",
 'I love Bare Escentuals bare minerals products. I treated myself to 4 of the products. I found the accompanying brushes high in quality. I wash my brushes periodically to prevent break outs. The brushes do well. I have had many compliments on my complection. Even though I am older I still get breakouts. The minerals have helped to decrease the flare- ups.**** I can well identify with the comments by some customers about dry skin and looking older.**** I absolutely must use a moisturizer with each application. I wish so much this moisturizing issue would be addressed with TV presentations. Otherwise, without liberal application of a moisturizer, my skin would look extremely dry and chalky no matter how beautiful the glow! Also lines are quite visible if moisure is insufficient. In spite of all of this, I have found minerals to be a great makeup.It is worth the money and time to continue my use of a moisturizer routine I have used for years. Ann HannaI also wish the lids were plastic for easy washing after use. I use an alcohol wipe to clean the inside of the lids periodically.',
 "My 10 month old son loves this toy! It is one of his favorites. Not only does he like to put the shapes (and any other toy that will fit) in the holes, but he also loves to play with the shapes on the floor, especially the ones that shake and make noise. He also likes this toy a lot because he can open and close the lid of the drum, repeatedly putting in and taking out shapes and toys. The Shape Rattle 'n Roll has definitely been worth the mere five dollars that it cost!",
 "I have been looking a long time for gum that is not made of materials bad for your health. I'm not worried about taste, but this tastes good and more importantly for me it is healthy and chews well. Some of the healthy chewing gums just fall apart. to me, healthy chewing gum means it doesn't have sugar or the horrible chemicals you find in the sugarless gums sold at grocery stores.",
 'We adopted two cats from a rescue shelter, a male first and then a female a couple of days later. They got along okay in the beginning but became more and more jealous of each other. The male had to be the boss of everything...food, toys and attention. The female started getting back at him about two months later by wetting in his favorite hang-out spots on the floor and then on my husband\'s leather recliner. I tried Nature\'s Miracle on the carpet first but it didn\'t work. The smell was still there and she went back and wet on the same spot.After researching online and reading the reviews and tips from other customers, I ordered a gallon of Urine Off For Cats through Amazon, as well as a blacklight from Walmart and a big marinade infuser syringe from Linens N Things. The blacklight found spots we were unaware of, including under the recliner. I took masking tape and marked the area about 6" beyond each spot on the carpet and then marked spots about 4" apart within each circle. I poured about 3 cups of Urine Off into a 4 cup measuring cup to make it easier to draw the solution into the syringe. Then I injected each spot with a full syringe of Urine Off, marking each spot with an X in pen on the masking tape as I went along so I knew where I had already injected. Eventually the tip on the syringe was bending so I found that it was easier to use a big skewer to poke the hole first and then push the syringe into the carpet. When I finished injecting, I then filled a pump sprayer with the solution and saturated the top of the carpet. I covered the spots with plastic garbage bags for 2 days and then allowed them to air dry.For the leather recliner I had to pull the leather covers away from the back of the cushions and spray the leather both inside and out and around the zippers. I injected the cushions with the syringe like I did the carpet and put them in garbage bags. I also put a plastic tarp under the chair and sprayed everywhere the urine may have gone on top and underneath of the chair including any padding and all of the metal and springs. Then I covered it all with plastic bags for 2 days before letting it air dry. Check the cushions to be sure mold doesn\'t start growing. I removed them from the plastic early. I used a leather conditioner afterwards to restore the pliability to the leather. The metal underneath began to rust in some places, but it came off with WD 40 when we treated the hinges afterwards.I wish I could say I only had to do all of this work once, but I had to repeat it a second time before all of the smell was gone to my sensitive nose. To be fair, the directions say it may take two or more treatments for the smell to be eliminated. Also, I had used the Nature\'s Miracle on the two small spots I was aware of first which may have made it harder for the Urine Off to work. I\'m sure I didn\'t get down to the subfloor with the Nature\'s Miracle and I didn\'t cover it with plastic. In fact, I used a fan on it to dry it faster which I now know is the opposite of what you should do. But because the Urine Off directions said you had to saturate the carpet, the padding and the subfloor below the padding, and also to widen the area that you treat beyond the original spots, I had to buy a 2nd gallon. To repeat the process, I bought a 3rd gallon. But the end result is that we don\'t smell any urine odor. Only a slight lemony smell of the solution remains. I was able to save our $1800 recliner, but sadly, my husband insisted that our female cat go back to the rescue shelter. I\'m sure she will be much better off as an only cat who is queen of her castle just as our boy enjoys being king.',
 "We like trying new flours for our home made bread (mostly sourdough). Spelt works very well with sourdough starter. Gives the bread a subtle nut like flavor and fine texture. Plus, it doesn't affect the rise very much (we do add gluten to assist the sourdough rise). Of amaranth, teff, and spelt, we like spelt the best.",
 'One of the many things I can do during the day is use smaller amounts of paper to try to reduce my carbon footprint. I know it would be better to not use papertowels at all, but this is surely a good alternative for those of us on the path to global warming enlightenment.',
 'The product is wonderful - it leaves the laundry smelling very fresh. It is a pleasure to deal with the folks at [...]. They are very quick with their deliveries and responsive.',
 "I originally gave this product a 5 star review, but I was forced to edit the review after 3 months of use. I find the major problem with this item is that if you have a very messy diaper and can't tightly bundle it without a mess, the mess gets all over the module that dumps the diaper and the stink is OUTSIDE of the diaper pail.I used this pail with both cloth and disposable diapers (during different time periods). The first month it worked great for both, in my opinion, although it didn't hold near as many cloth diapers and they occasionally needed some help getting into the pail. However, once my baby grew into the next size of cloth diapers, it was IMPOSSIBLE to use the champ with them. Now, I understand this pail is not marketed for use with cloth diapers so I won't hold that against it, however just in case you are considering it for such a purpose as I did, DON'T.The last complaint I have with this pail is that after only 2 months of use the inner seal broke and fumes were all over the room. I did NOT call the company for a replacement as the other reviewer did, because I cannot use the pail with my cloth diapers.This pail has become garage sale fare.",
 "It's a nice replica toy for children. Good work with the full retractable blade, but not very shinny (the similar saber created before shines much more). a little big for kids, but fun to play with. my son loves it.",
 "This machine has plenty of power. I have only used it to make pizza dough and it worked extremely well. The dough came out great and I can't wait to use the shredding blades next.",
 'This pasta has a wonderful natural flavor and is naturally high in fiber and nutrients. I eat it plain sometimes, and use it in place of white rice and other more processed grains.',
 'I really recommend this product. The price on Amazon was a lot better than I could find in any store. The product arrived ahead of expected delivery time. It works really well, its quick to heat up and does a really good job of smoothing down my thick hair!']

p = pipeline(
    'zero-shot-classification',
    model='facebook/bart-large-mnli',
    device=0,
)


def _revised_forward(self, inputs):
    candidate_label = inputs["candidate_label"]
    sequence = inputs["sequence"]
    model_inputs = {k: inputs[k] for k in self.tokenizer.model_input_names}  #type: ignore
    outputs = self.model(**model_inputs, use_cache=False)

    model_outputs = {
        "candidate_label": candidate_label,
        "sequence": sequence,
        "is_last": inputs["is_last"],
        **outputs,
    }
    return model_outputs

# Uncommenting the next line makes memory usage stay flat; leaving it commented
# reproduces the spike. The only difference between `_revised_forward` and the
# transformers repo is that we pass `use_cache=False` as an extra arg when
# calling `self.model`.

#p._forward = _revised_forward.__get__(p)

p(
    tmp_repro_data,
    multi_label=True,
    candidate_labels=list(range(130)),
)

Expected behavior

Letting this script run as-is causes CPU memory (not GPU memory) to spike above 10 GiB at around 1,500 inference calls. This can break many environments, especially jobs running on resource-constrained machines.

After some debugging, we traced this to the `past_key_values` object returned by the BART model, which is a tuple of very large tensors. We suspect these large tensors prevent garbage collection from keeping up while the pipeline accumulates all of the model inference outputs in a single list. Passing `use_cache=False` to model inference (so that `past_key_values` is never returned) eliminates the memory spikes, which suggests this was indeed the issue.
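The mechanism described above can be sketched without transformers at all. In this plain-Python sketch, `fake_forward` is a hypothetical stand-in for `self.model(**model_inputs)`: with the cache enabled, each output carries a large buffer standing in for `past_key_values`, and `tracemalloc` shows how quickly retained outputs add up when they are held in one list.

```python
import tracemalloc

# Plain-Python sketch (no transformers). `fake_forward` is a hypothetical
# stand-in for a model call; the bytearray stands in for the tuple of large
# past_key_values tensors.

def fake_forward(use_cache):
    out = {"logits": [0.0] * 8}
    if use_cache:
        out["past_key_values"] = bytearray(1 << 20)  # ~1 MiB per call
    return out

def peak_retained(n_calls, use_cache):
    """Peak traced memory while n_calls outputs are held in one list."""
    tracemalloc.start()
    outputs = [fake_forward(use_cache) for _ in range(n_calls)]
    peak = tracemalloc.get_traced_memory()[1]
    tracemalloc.stop()
    del outputs
    return peak

with_cache = peak_retained(50, use_cache=True)      # tens of MiB retained
without_cache = peak_retained(50, use_cache=False)  # a few hundred KiB at most
assert with_cache > 50 * (1 << 20)
assert without_cache < (1 << 20)
```

At 2,600 real inference calls with multi-hundred-MiB cache tuples instead of 1 MiB buffers, the same arithmetic produces the multi-GiB spikes reported here.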

ydshieh (Collaborator) commented Jul 18, 2023

Hi @rsmith49

Thank you for opening this issue 🤗 . I will take a look!

ydshieh self-assigned this Jul 18, 2023
rsmith49 (Author) commented Jul 18, 2023

> could you confirm that the issue happens only when the results (of all 1500 inference calls) are saved? (I think it's yes?)

I have been running in a jupyter notebook, which I think does save the results from calling the pipeline since it is the final statement in the cell - let me try in a regular python process and see if the memory spikes the same.

I should note, though, that the "1500 inference calls" I mentioned are only over 20 documents: since there are 130 `candidate_labels`, the pipeline calls the model for inference 2600 times (130 * 20). So saving the results here means keeping only 20 dicts with ranked scores for each candidate label.

> when you save the model inference results, do you also contain those returned past_key_values? (I guess not ..?)

Correct, the result from the pipeline does not contain the `past_key_values`. The "storing in a single list" code occurs here, and stepping through with pdb shows the iterator's internal function creating a reference to the `past_key_values` at each `__next__` call.
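That reference-retention behavior can be reproduced in isolation. In this minimal sketch (plain Python, no transformers; `Big` is a hypothetical stand-in for a model output holding `past_key_values`), the generator frame keeps the most recently built output alive until the next `__next__` call rebinds the local name:

```python
import weakref

class Big:
    """Hypothetical stand-in for a model output holding large past_key_values."""
    pass

def inference_iterator():
    # Mimics a pipeline's internal iterator: the local name `out` lives in the
    # generator frame between __next__ calls, pinning the previous output.
    for _ in range(2):
        out = Big()
        yield weakref.ref(out)

it = inference_iterator()
ref = next(it)
# The caller kept nothing but a weak reference, yet the generator frame
# still holds `out`, so the "large" object stays alive.
assert ref() is not None
next(it)  # advancing rebinds `out`; only now is the first output collectable
assert ref() is None
```

With one cached output pinned per iterator frame this alone would be a bounded overhead, but with very large `past_key_values` tuples even one extra live copy per stage is significant.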

ydshieh (Collaborator) commented Jul 18, 2023

Hi, I am not able to reproduce this with the following (slightly modified) script (see at the end), running directly in Python:

iteration: 0
RAM: 4318.6015625 MB
timing: 18.248116 sec.
==============
iteration: 156
RAM: 4319.5 MB
timing: 18.464201 sec.

It would be great if you could check whether the issue also happens with a plain Python script.

However, this is a sequence classification model, and `past_key_values` is not used by this model. I will try to have a fix anyway. Thank you again for reporting this issue to us!

from transformers import pipeline

tmp_repro_data = ['I purchased this to replace my 7 yr old video baby monitor that had been dropped too many times.'] * 20

ckpt = 'facebook/bart-large-mnli'
# ckpt = 'facebook/bart-base'

p = pipeline(
    'zero-shot-classification',
    model=ckpt,
    device="cuda",
    batch_size=20,
)

import pdb; pdb.set_trace()

def _revised_forward(self, inputs):
    candidate_label = inputs["candidate_label"]
    sequence = inputs["sequence"]
    model_inputs = {k: inputs[k] for k in self.tokenizer.model_input_names}  #type: ignore
    outputs = self.model(**model_inputs, use_cache=False)

    model_outputs = {
        "candidate_label": candidate_label,
        "sequence": sequence,
        "is_last": inputs["is_last"],
        **outputs,
    }
    return model_outputs

# Uncommenting `p._forward = _revised_forward.__get__(p)` below avoids the
# memory spikes. The only difference between `_revised_forward` and the
# transformers repo is that we pass `use_cache=False` as an extra arg when
# calling `self.model`.

import psutil
import os
process = psutil.Process(os.getpid())

#p._forward = _revised_forward.__get__(p)

import datetime

for i in range(1000):
    s = datetime.datetime.now()
    o = p(
        tmp_repro_data,
        multi_label=True,
        candidate_labels=list(range(130)),
    )
    e = datetime.datetime.now()
    d = (e-s).total_seconds()
    mem = process.memory_info()[0] / float(2 ** 20)
    print(i)
    print(mem)
    print(d)
    print("=" * 80)
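Independent of whether a given model accepts `use_cache`, the cache tensors could also be kept out of the accumulated results by stripping cache-like keys from each output dict before it is stored. A minimal hypothetical sketch (plain Python; the key names are assumptions for illustration, not the actual fix from #24893):

```python
# Hypothetical sketch: drop large cache-like entries from a model output dict
# before it is appended to the pipeline's result list. The key names below are
# assumptions, not the exact keys every model returns.
CACHE_KEYS = {"past_key_values", "encoder_last_hidden_state"}

def strip_cache(outputs):
    """Return a shallow copy of `outputs` without cache entries."""
    return {k: v for k, v in outputs.items() if k not in CACHE_KEYS}

# Usage: `object()` stands in for a tuple of large tensors.
out = {"logits": [0.1, 0.9], "past_key_values": object()}
slim = strip_cache(out)
assert "past_key_values" not in slim
assert slim["logits"] == [0.1, 0.9]
```

The `use_cache=False` approach is still preferable where supported, since it avoids allocating the cache tensors in the first place rather than discarding them afterwards.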

rsmith49 (Author) commented Jul 18, 2023

Thanks for looking into this!

Weirdly, I also did not see memory spikes when using a single text snippet copied 20 times, only when using 20 unique strings (I'm guessing some caching somewhere in Python, torch, or transformers makes garbage collection more effective for repeated inputs). So if you could try using the example list I posted above, that may do it.

Haven't had a chance to run the script in a pure python process but will let you know when I do!

ydshieh (Collaborator) commented Jul 18, 2023

That would be nice to know! (I am opening a PR soon anyway :-) )

rsmith49 (Author) commented Jul 18, 2023

Ran the script as just `python tmp_script.py` and saw memory go as high as 7.9Gi before I killed the process, somewhere around 1187 samples (NOTE: same environment as above, transformers==4.27.4, Python 3.8.0). So it looks like it occurs not just when saving the result of `p(...)`, and is not just an artifact of notebooks 👍

ydshieh (Collaborator) commented Jul 19, 2023

> go as high as 7.9Gi

You used the 20 different text sentences in `tmp_repro_data`, right?

(I am running with the same text repeated 20 times, on the latest main of transformers.)

rsmith49 (Author) commented

> You used the 20 different text sentences in `tmp_repro_data`, right?

Yes. I'm not sure why repeating the same text doesn't trigger it, but I get the same result as you when using repeated text.
