Conversation
Dear Raphael, is the dictchanges branch of HTT not compiling? Do you scram b it together with the dictchanges branch of Kappa?
Hi, I used the CMSSW8 test file and ran it through Kappa and the Run2SMAnalysis. The output is, according to compareRootFiles.py, identical. To prevent further time-consuming merge conflicts I am pushing a bit and have set the deadline for vetos to tomorrow at 12:00. Please let me know if you have any objections. Before merging to the master I'll then provide the manual testing procedure I already proposed, but I will do this on the dictchanges branch. Cheers,
Hello Raphael, I fear I have to redo the skim that I did over the weekend, for which I already committed filelists on the tmpdictchanges branch in the kithiggs repository, in order to be able to continue using this branch. On top of that, I cannot test the changes here until tomorrow noon. Regards,
I think another minimum requirement would be testing CMSSW 7-9 (via Travis).
Hi Thomas, the merge in the KITHiggs and Artus repositories can wait. In KITHiggs only one line changed, so take your time testing your skim and the changes in Kappa. I just wanted to make sure, as you requested, that there's a full setup from Kappa to the Artus output n-tuple. From the Kappa side I don't see the need to wait: the basic tests succeeded and it's just a merge to another development branch, nothing that will be used for production. Cheers
You take the Travis configuration from the dictchanges branch without modifications. This is why only this branch is checked out and tested.
Conflicts: .travis.yml
Thank you for pointing that out. It won't happen again: with the fixed config it also runs fine locally.
Hi Raphael, one question: what is the time scale for having Artus ready to run with the new skimming of Kappa?
Hi Claudia, the timescale is yesterday. Use the getByToken branches in KITHiggs, KappaTools and the Artus repository. Cheers
Ok, I will try that, thanks. Cheers
Hi Raphael, I just merged getByToken into dictchanges in Htt, since I am the only one working on it now - my code ran with no issues and nothing changed. Thank you,
Hello Raphael, sorry, I was busy with other work in the last few hours since the PR was announced. Now I have checked it and I see problems when running with increased verbosity (2) using this testing command:
Olena confirms this. I did not get this problem before the merge of this PR. May I ask why we need to merge major changes under this pressure, and whether we can urge you to provide a bugfix on the same time scale? Regards,
Hi Thomas, I just ran the command on the dictchanges branch and do not see the error. Did you recompile Kappa, maybe even with a scram b clean before? This error is exactly the reason the getByToken branch required the extension of the trees saved to the Kappa output. At the moment I can't see what the difference between the SUSY160GeV sample and the DY sample should be. The PR was already there for 6 days. It passed all tests that you requested, and it was only a PR to the dictchanges branch, where one usually does not even have to do PRs. I got more and more busy solving merge conflicts and directly adjusting commits that had just been made. The testing setup is there, and if problems arise I'm taking care of them. The same goes for the problem you posted above: just provide me with a reproducible example and I'll fix it. Cheers
The problem occurs only with increased verbosity and remains after scram b clean + scram b.
Thank you for the clarification. So it was also there in the standard test case and not something special to the wiki command. It's fixed: ca1e899
Hello Raphael, I see. We (everybody who relies on the dictchanges branch) are forced to use these trees due to your rushed and self-accepted merge of this PR. We had samples to use in Artus before this merge (as already written), but we cannot use them any longer because of the additional dictchanges. The weekend is a perfect chance for a new skim (which we are forced to do due to this merge), but we need to check a test output of Kappa at the Artus level first. You can imagine that a fix of this issue would be highly appreciated before the weekend (= today). Regards,
Did you notice that KLumiInfo (collection name lumiInfo) is still saved? It makes no sense at all to fill the lumi number in the Run tree...
Therefore I wrote:
I forgot to mention that the
Dear Raphael and all, I see that there has been no reaction so far (apart from an assertion in KappaTools) regarding what I wrote last. I understand that the idea is still not finalised and have therefore reverted the merge of this PR in order to give it more time for more detailed discussions and checks. I understand that we are somehow forced by updates in CMSSW to write out some quantities in the run tree. In this sense, we should not restrict the code to run only on files that contain events from a single run; we should implement the same mechanism we already have for lumi sections and the lumi tree. What do you and others think? Regards,
This reverts commit fe54fcb.
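A minimal sketch of the mechanism Thomas proposes here, assuming illustrative names ("Runs" tree, "run"/"xsec" branches) rather than Kappa's actual format: on the reading side, per-run metadata is indexed by run number, exactly like the existing lumi-section lookup, so files containing events from several runs remain usable.
```
// Hedged sketch, not Kappa's actual reader: tree and branch names are
// hypothetical. One tree entry per run means a multi-run file simply
// yields several map entries.
#include <map>
#include "TFile.h"
#include "TTree.h"

std::map<unsigned int, double> loadRunInfo(const char* path) {
  TFile file(path);
  TTree* runs = static_cast<TTree*>(file.Get("Runs"));  // hypothetical tree name
  std::map<unsigned int, double> byRun;
  if (!runs) return byRun;

  unsigned int runNumber = 0;
  double xsec = 0.;
  runs->SetBranchAddress("run", &runNumber);  // hypothetical branch names
  runs->SetBranchAddress("xsec", &xsec);

  // Loop over all run entries and key the metadata by run number.
  for (Long64_t i = 0; i < runs->GetEntries(); ++i) {
    runs->GetEntry(i);
    byRun[runNumber] = xsec;
  }
  return byRun;
}
```
A consumer would then look up the map with the run number of each event instead of assuming a single run per file.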
* minor: a wrapper to use cout more convenient
* KPatTauProducer: additional members for KTau
  * collection of both signal and isolation pions for each tau candidate
  * collection of tracks associated to the pions (todo: store the corresponding mappings)
* KPatTauProducer: add to prev commit
* store isolationChargedHadronCandidates sorted
* KPFCandidate: add KTracks as a member
* fixed the bug in signalNeutrHadrCands
* add parameter to specify in the PSet whether the members needed for Kaons should be stored (no by default)
* cosmetics
* KPatTauProducer: remove unused members
* dataset.json: add cs to QCD FALL15 MC
  nickname: QCD_RunIIFall15MiniAODv2_PU25nsData2015v1_13TeV_MINIAOD_pythia8
  xsec: 381304 as set [here](https://github.com/cms-tau-pog/jetToTauFakeRate/blob/master/data/wjets_samples.json#L25):
  ```
  "dtag":"MC13TeV_QCDMuEnriched",
  "split":110,
  "xsec":381304,
  "br":[ 1.0 ],
  "dset":["/QCD_Pt-20toInf_MuEnrichedPt15_TuneCUETP8M1_13TeV_pythia8/RunIIFall15MiniAODv2-PU25nsData2015v1_76X_mcRun2_asymptotic_v12-v1/MINIAODSIM"]
  ```
* Add more fields
  KBeamSpot:
  * rotatedCovariance3D
  KTrack:
  * innerPosition
  * charge is stored in short now
* Correct KTrack: remove innerPosition as inaccessible
* KBasic: rotatedCovariance3D is behaving strangely - removed
* first step: most often called getByLabel to getByToken (see the sketch after these lists)
* KTowerProducer
* KTaupairVerticesMapProducer
* Muon Producer, dropping the srcMuonIsolationPF support since not used any more in any config
* removed producer that is already deactivated for a long time
* remove sigma from pileup density
* deactivate Producer that is not used any more
* FilterSummary Producer: deactivating parts that probably have not been used because they lack the consumes call
* KBasicTauProducer
* TriggerObjectStandaloneProducer
* deactivate JetID from CaloJetProducer -> was dysfunctional anyway since it lacks the consumes call
* deactivate ExtendedTauProducer -> no analysis is using it atm
* remove precompiler statements for KDataInfoProducer
* KInfoProducer
* remove non-functional and hardly configurable code parts
* ElectronProducer
* KGenInfoProducer
* move from local to global settings
* switch to usual reco::VertexCollection like in KElectronProducer
* reconfigure muons
* change to proper tree
* change path
* checkoutCmssw*PackagesForSkimming.sh: adjust Kappa branches to avoid failing tests. Revert this commit before merging into the master
* use the inputTag class member
* KTrack IP corrected
  * the vertex with respect to which the IP parameters are estimated is stored
  * the first valid vertex of the collection is considered
  ATTENTION: when calculating this, the TransientTrackBuilder has to be passed together with the reco::Vertex collection
* KPFTau expanded
  * for each tau candidate, all charged particles additionally store their IP with respect to the first valid reco::Vertex
  * added TransientTrackBuilder member
  * added tokens to access new collections
* KTrackProducer.h, KTrack.h: provide access to track parameters and save their covariance matrix
* KTrackProducer.h: write out covariance matrix for all tracks
* KTrack.h: remove redundant information and offer get functions instead
* Delete .nfs00000002a3c27b1900003508 (remove committed binary - sorry for that)
* Revert "minor: a wrapper to use cout more convenient" - this should fix the compiler warnings. This reverts commit fcd39e2.
* KTrack.h, KTrackProducer.h: save B field at reference point
* KMuonProducer.h: write out extended track info (no vertex-related infos added yet)
* KTau.h, KPatTauProducer.h: preparations for saving SV of 3-prong taus
* KPatTauProducer.h: got SV refitting working
* KBasic.h: set default values of some KVertex members
* Extend gitignore: no .nfs*
* Kaons: added new types KKaons and KTransTrack to help reconstruct the theon Peak
* Kaons: temp enable the kaon production
* Kaons: add more new fields to the Kaon objects and fill them (TODO: fix the empty histograms)
* Revert "Kaons: temp enable the kaon production" - this reverts commit d8ff77a.
* Kaons: turn on
* Kaons: extend members
* KTrack: add function to get valid vertex index (TODO: move it)
* KTrackProducer.h: simplify parameters of KTrackProducer::fillTrack
* KElectronProducer.h, KPatTauProducer.h: switch on extended track infos for all electron and tau tracks
* KElectronProducer.h: fix indentation
* Kaon: add needed variables
* deactivate non-used collection
* renaming _run_tree to _lumi_tree because that's what it actually is
* adding a Run Tree, able to be filled within endRun
* KTrack: remove unnecessary member
* introducing the run tree
* Kaons: add more temp members, cleanup
* changing the Info producers to write to the run tree - necessary to be future-proof when accessing endRun becomes more restrictive
* VertexSummary: cosmetics
* VertexSummary: start to add the validity check
* KTrack: simplify some functions
* KRoot: clean up to be readable
* Kaon: dirty working version of the code
* manual adjustments to merge commit
* KTrack: remove the getValidVertex function, use the one from KVertexSummaryProducer instead
* Add covariance3D to KBeamSpot (temp.), use typedefs
* Kaons:
  * clean up a bit
  * add some TEMP variables for the checks
  * all the crucial variables stored
* Kaons: the tracks should at least point in the same quadrant
* Kaons: add the full PFCandidate
* Tests: download files only once
* added block function to calculate IP wrt RefitPV
* still trying to fetch RefitVertexCollection
* fixed the problem: now able to access RefitVertexCollection in other producers
* additional changes
* added fillIPInfo call in ele, mu and tau KProducers
* automatically determine the branch from Travis
  Conflicts: .travis.yml
* Kaons: enable only for data and QCD
* DatasetManager: clean up
* fix bug on verbosity > 1
* remove precompiler statements catching differences before CMSSW76X
* Kaons: remove unused members
* KGenInfoProducer.h: less debugging output w.r.t. LHE weights
* KGenInfoProducer.h: extract mapping between human readable names and indices for LHE weights
* KGenInfoProducer.h: extract mapping between human readable names and indices for LHE weights, minor improvements
* KInfo.h: add plural "s" to std::vector<float> KGenEventInfo::lheWeight
* KGenInfoProducer.h: move verbosity check in preparation for saving of new members of KGenRunInfo
* KGenInfoProducer.h: fill std::map<std::string, std::string> KGenRunInfo::lheWeightNamesMap
* KInfo.h: extended functions to access LHE weights
* Revert "Merge pull request #51 from KappaAnalysis/getByToken" - this reverts commit 12ab11c, reversing changes made to c74bf50.
  Conflicts: Producers/interface/KGenInfoProducer.h, Producers/interface/KMuonProducer.h
* KInfo.h: modify unreasonable assertion
* Revert "Revert "Merge pull request #51 from KappaAnalysis/getByToken"" - this reverts commit fe54fcb.
* KInfo.h: remove check whether weights are within a certain/arbitrary range
* KInfo.h: getLheWeightNamesMap returns again the requested name (not the mapped internal name as before)
* LinkDef.h: remove file which is not needed any more
* KInfoProducer.h, KInfo.h: create/fill/save Runs tree based on the same principle as the Lumis tree
* KElectronProducer.h, KMuonProducer.h: minor changes (comments, indentation)
* KLeptonPairProducer.h: change to getByToken
* KExtendedTauProducer.h: change to getByToken
* KPatTauProducer.h: change to getByToken
* KCaloJetProducer.h, KFilterSummaryProducer.h: added comments
* Last update of Kaon production, sync with getByToken
* removing log output (1)
* Test script is cleaned up
* KPatTau: revert non-working getByToken
* Remove all Kaon information
* TESTS: enable more output
* KGenInfoProducer.h: fix LHE weight regex, keyword in weightgroup can be either "type" or "name"
* kSkimming_run2_cfg.py: improve LHE settings
* checkout script 8: comment out the duplicating package checkout
* Update test_build.sh: add wget installation
* clean up
* kSkimming_run2_cfg.py: store LHE weights and particles only for Z/H->tautau events and LFV samples
* KPatTauProducer.h: fixed clean up commit
* fix the muon configuration
* fix config for electrons
* Add exceptions calls
* added missing parts for srcMuonIsolationPF pset and some more updates
* Let git ignore vim swap files
* Whitespace fixes on datasets.json
* Remove 2017 datasets from database
  Conflicts: Skimming/data/datasets.json
* Readd 2017 datasets (data only) up to D v1
* Add missing global tags to 2017 data
* Add Summer17 MC samples: DYJetsToLL, WJetsToLNu, TT
* Added EmbeddingRun2017B MuTau dataset
* SkimManager.py: simplify calls of the crab_cmd function; the version with pool was lacking a proper timeout
* datasetsHelperTwopz.py: sort nick lists for a more deterministic behaviour
* datasets.json: remove non-existing sample
* kSkimming_run2_cfg.py: minor change
* kSkimming_run2_cfg.py: put the lines to select few events in a safer place
* SkimManager.py: enabled forced submission of jobs in exception state
* KPatTauProducer.h: fix indentation
* KTau.h, KPatTauProducer.h: added refitted a1 properties
* KTau.h: added resonancePdgId function
* Fix config file to process in 9XX
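The recurring change across these commit lists is the getByLabel -> getByToken migration. Below is a hedged sketch of that pattern in a generic CMSSW analyzer, not Kappa's actual producer code; the class name and the "vertices" parameter are illustrative. The point is that consumes() registers the product dependency once in the constructor, and newer CMSSW releases enforce such registered access, which is why parts lacking the consumes call had to be deactivated.
```
#include "DataFormats/Common/interface/Handle.h"
#include "DataFormats/VertexReco/interface/Vertex.h"
#include "DataFormats/VertexReco/interface/VertexFwd.h"
#include "FWCore/Framework/interface/Event.h"
#include "FWCore/Framework/interface/stream/EDAnalyzer.h"
#include "FWCore/ParameterSet/interface/ParameterSet.h"
#include "FWCore/Utilities/interface/InputTag.h"

// Illustrative sketch, not a Kappa producer.
class VertexReaderSketch : public edm::stream::EDAnalyzer<> {
public:
  explicit VertexReaderSketch(edm::ParameterSet const& cfg)
      // consumes() declares the dependency up front; producers without
      // such a call are what the "lack the consumes call" commits refer to.
      : vertexToken_(consumes<reco::VertexCollection>(
            cfg.getParameter<edm::InputTag>("vertices"))) {}

  void analyze(edm::Event const& event, edm::EventSetup const&) override {
    // Old, untracked style that the migration replaced:
    //   event.getByLabel("offlineSlimmedPrimaryVertices", vertices);
    edm::Handle<reco::VertexCollection> vertices;
    event.getByToken(vertexToken_, vertices);
    if (vertices.isValid() && !vertices->empty()) {
      // ... use vertices->front() ...
    }
  }

private:
  edm::EDGetTokenT<reco::VertexCollection> vertexToken_;
};
```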
Hi,
a change in the Kappa data format was unavoidable: accessing an InRun product is only possible in the endRun method, and endRun is only called after the lumi tree has already been written several times. Any solution other than also introducing a run tree would have been a nasty fix.
I would like to merge this branch into the dictchanges branch asap.
I'll prepare the necessary changes to Artus in the next few days so that it can use this output.
Cheers
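To make the endRun constraint above concrete, here is a hedged sketch (illustrative class, tree, branch and parameter names, not the actual Kappa producers): an edm::InRun product can only be read in endRun, which fires once per run and only after endLuminosityBlock has already filled the lumi tree for every lumi section, so per-run quantities need their own tree.
```
#include "CommonTools/UtilAlgos/interface/TFileService.h"
#include "DataFormats/Common/interface/Handle.h"
#include "FWCore/Framework/interface/Event.h"
#include "FWCore/Framework/interface/LuminosityBlock.h"
#include "FWCore/Framework/interface/Run.h"
#include "FWCore/Framework/interface/one/EDAnalyzer.h"
#include "FWCore/ParameterSet/interface/ParameterSet.h"
#include "FWCore/ServiceRegistry/interface/Service.h"
#include "FWCore/Utilities/interface/InputTag.h"
#include "SimDataFormats/GeneratorProducts/interface/GenRunInfoProduct.h"
#include "TTree.h"

// Illustrative sketch, not a Kappa producer.
class RunTreeSketch : public edm::one::EDAnalyzer<edm::one::WatchRuns,
                                                  edm::one::WatchLuminosityBlocks,
                                                  edm::one::SharedResources> {
public:
  explicit RunTreeSketch(edm::ParameterSet const& cfg)
      // "generator" is an assumed InputTag parameter for the InRun product.
      : runInfoToken_(consumes<GenRunInfoProduct, edm::InRun>(
            cfg.getParameter<edm::InputTag>("generator"))) {
    usesResource("TFileService");
    edm::Service<TFileService> fs;
    lumiTree_ = fs->make<TTree>("Lumis", "one entry per lumi section");
    lumiTree_->Branch("run", &run_);
    lumiTree_->Branch("lumi", &lumi_);
    runTree_ = fs->make<TTree>("Runs", "one entry per run");
    runTree_->Branch("run", &run_);
    runTree_->Branch("xsec", &xsec_);
  }

  void analyze(edm::Event const&, edm::EventSetup const&) override {}
  void beginRun(edm::Run const&, edm::EventSetup const&) override {}
  void beginLuminosityBlock(edm::LuminosityBlock const&, edm::EventSetup const&) override {}

  void endLuminosityBlock(edm::LuminosityBlock const& lumi, edm::EventSetup const&) override {
    // Called once per lumi section, i.e. many times before endRun().
    run_ = lumi.run();
    lumi_ = lumi.luminosityBlock();
    lumiTree_->Fill();
  }

  void endRun(edm::Run const& run, edm::EventSetup const&) override {
    // The only place where the InRun product is accessible; squeezing its
    // content into the already-filled lumi tree would be the "nasty fix".
    edm::Handle<GenRunInfoProduct> info;
    run.getByToken(runInfoToken_, info);
    run_ = run.run();
    xsec_ = info->internalXSec().value();
    runTree_->Fill();
  }

private:
  edm::EDGetTokenT<GenRunInfoProduct> runInfoToken_;
  TTree* lumiTree_ = nullptr;
  TTree* runTree_ = nullptr;
  unsigned int run_ = 0;
  unsigned int lumi_ = 0;
  double xsec_ = 0.;
};
```
Filling the run tree in endRun rather than abusing the lumi tree is exactly the design choice the run/lumi tree split in the commits above implements.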