Conversation

@geoff-reid geoff-reid commented May 1, 2017

Modify UCS discovery jobs to allow discovery and cataloging of UCS
nodes in the same graph

@uppalk1 @keedya @tannoa2 @RackHD/corecommitters

return waterline.nodes.create(node);
return waterline.nodes.create(node)
.then(function(node) {
return node


Missing semicolon.

);
self.context.physicalNodeList = _.flattenDeep(self.context.physicalNodeList.concat(
_.map(servers, function(it) {
return [_.get(it, 'id'), _.get(it.relations[0], 'targets') ]


Missing semicolon.

.spread(function(rackmounts, servers) {
self.context.physicalNodeList = (self.context.physicalNodeList || []).concat(
_.map(rackmounts, function(it) {
return _.get(it, 'id')


Missing semicolon.

self._done(err);
});
return Promise.map(this.nodes, function(node) {
return self.catalogRackmounts(node)


Missing semicolon.

this.node = this.context.target;
this.nodes = (this.context.physicalNodeList || []).concat(this.context.logicalNodeList);
if(_.isEmpty(this.nodes)) {
this.nodes = [ this.context.target ]


Missing semicolon.

@geoff-reid geoff-reid force-pushed the ucs-discovery-changes branch from 03d1a0f to 65ba80d Compare May 1, 2017 19:23
@JenkinsRHD

BUILD on-tasks #57 : FAILURE

@JenkinsRHD

BUILD on-tasks #58 : FAILURE

BUILD on-tasks #58 Error Logs

Test Name: test_nodes_discovery
Error Details: timeout waiting for task discovery

-------------------- >> begin captured logging << --------------------
tests.api.v2_0.nodes_tests: INFO: Wait start time: 2017-05-01 15:51:24.727927
amqp: DEBUG: Start from server, version: 0.9, properties: {u'information': u'Licensed under the MPL. See http://www.rabbitmq.com/', u'product': u'RabbitMQ', u'copyright': u'Copyright (C) 2007-2013 GoPivotal, Inc.', u'capabilities': {u'exchange_exchange_bindings': True, u'connection.blocked': True, u'authentication_failure_close': True, u'basic.nack': True, u'consumer_priorities': True, u'consumer_cancel_notify': True, u'publisher_confirms': True}, u'platform': u'Erlang/OTP', u'version': u'3.2.4'}, mechanisms: [u'AMQPLAIN', u'PLAIN'], locales: [u'en_US']
amqp: DEBUG: Open OK!
kombu: INFO: Starting AMQP worker -> graph.finished.*>
amqp: DEBUG: Start from server, version: 0.9, properties: {u'information': u'Licensed under the MPL. See http://www.rabbitmq.com/', u'product': u'RabbitMQ', u'copyright': u'Copyright (C) 2007-2013 GoPivotal, Inc.', u'capabilities': {u'exchange_exchange_bindings': True, u'connection.blocked': True, u'authentication_failure_close': True, u'basic.nack': True, u'consumer_priorities': True, u'consumer_cancel_notify': True, u'publisher_confirms': True}, u'platform': u'Erlang/OTP', u'version': u'3.2.4'}, mechanisms: [u'AMQPLAIN', u'PLAIN'], locales: [u'en_US']
amqp: DEBUG: Open OK!
kombu.mixins: INFO: Connected to amqp://guest:**@127.0.0.1:9091//
amqp: DEBUG: using channel_id: 1
amqp: DEBUG: Channel open
modules.worker: ERROR: subtask timeout after 1200 seconds, (id=discovery), stopping..
kombu: INFO: Stopping AMQP worker -> graph.finished.*>
modules.worker: INFO: stopping subtask for discovery
amqp: DEBUG: Closed channel #1
--------------------- >> end captured logging << ---------------------

Stack Trace:
File "/usr/lib/python2.7/unittest/case.py", line 331, in run
    testMethod()
File "/usr/lib/python2.7/unittest/case.py", line 1043, in runTest
    self._testFunc()
File "/home/jenkins/workspace/on-tasks/RackHD/test/.venv/on-build-config/local/lib/python2.7/site-packages/proboscis/case.py", line 296, in testng_method_mistake_capture_func
    compatability.capture_type_error(s_func)
File "/home/jenkins/workspace/on-tasks/RackHD/test/.venv/on-build-config/local/lib/python2.7/site-packages/proboscis/compatability/exceptions_2_6.py", line 27, in capture_type_error
    func()
File "/home/jenkins/workspace/on-tasks/RackHD/test/.venv/on-build-config/local/lib/python2.7/site-packages/proboscis/case.py", line 350, in func
    func(test_case.state.get_state())
File "/home/jenkins/workspace/on-tasks/RackHD/test/tests/api/v2_0/nodes_tests.py", line 126, in test_nodes_discovery
    message='timeout waiting for task {0}'.format(self.__task.id))
File "/home/jenkins/workspace/on-tasks/RackHD/test/.venv/on-build-config/local/lib/python2.7/site-packages/proboscis/asserts.py", line 67, in assert_false
    raise ASSERTION_ERROR(message)
"timeout waiting for task discovery" (the full assertion message repeats the captured logging shown above)

@geoff-reid

test this please

context,
taskId);
this.node = this.context.target;
this.nodes = (this.context.physicalNodeList || []).concat(this.context.logicalNodeList);


From your implementation, some nodes are extracted from node relations, so there may be duplicated nodes in this.nodes. You need to remove those duplicates.
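A minimal sketch of the deduplication the reviewer is asking for, in plain JavaScript (the function name and the node IDs are illustrative; in the job's own lodash style, `_.uniq` on the concatenated list would do the same thing):

```javascript
// Combine the physical and logical node lists, then drop duplicate IDs.
// [...new Set(list)] preserves order and keeps only the first occurrence,
// matching lodash's _.uniq(list).
function mergeNodeLists(physicalNodeList, logicalNodeList) {
    var combined = (physicalNodeList || []).concat(logicalNodeList || []);
    return [...new Set(combined)];
}

// A node that appears both directly and as a relation target is kept once.
var nodes = mergeNodeLists(['node1', 'node2'], ['node2', 'node3']);
// nodes → ['node1', 'node2', 'node3']
```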

);
self.context.physicalNodeList = _.flattenDeep(self.context.physicalNodeList.concat(
_.map(servers, function(it) {
return [_.get(it, 'id'), _.get(it.relations[0], 'targets') ];


There are some hidden rules in your design:

  • There is at least one relation.
  • The relation type happens to be the "enclosure" relation.
  • The newly created "enclosure" relation happens to be the first relation.

If the node's relations are empty, you will get an "empty" node in the final self.context.physicalNodeList, but you forgot to remove that empty node.

If the first relation is not an "enclosure" relation, or the "enclosure" relation is not in the first position, you are handling the wrong node, and the UCS catalog will fail.

These hidden rules make the code error-prone, especially if somebody later refactors the code that generates the UCS server node without being aware of them. So my suggestion is:

  • Filter the relation targets by relation type instead of relying on a fixed position.
  • Always check whether the relations are empty.
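The reviewer's two suggestions could look roughly like this. This is a sketch in plain JavaScript, assuming the relation shape visible in the diff snippets (a `relations` array of `{relationType, targets}` objects); the lodash equivalent of the lookup is `_.find(it.relations, {relationType: 'encloses'})`:

```javascript
// Pull the enclosure targets for a server node, tolerating nodes whose
// relations array is missing, empty, or lacks an 'encloses' relation.
function getEnclosureTargets(node) {
    var relations = node.relations || [];
    // Filter by relation type instead of assuming a fixed position.
    var enclosure = relations.find(function(r) {
        return r.relationType === 'encloses';
    });
    // Check for a missing relation so no "empty" entries leak into
    // the physical node list.
    return (enclosure && enclosure.targets) || [];
}

// A node with no relations contributes nothing instead of undefined.
getEnclosureTargets({ id: 'server1', relations: [] });  // → []
getEnclosureTargets({
    id: 'server2',
    relations: [{ relationType: 'encloses', targets: ['chassis1'] }]
});  // → ['chassis1']
```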


@brianparry brianparry left a comment


👍

.spread(function() {
.then(function(logicalServers) {
self.context.logicalNodeList = (self.context.logicalNodeList || []).concat(
_.map(logicalServers, function(it) {


@geoff-reid You can use the _.property iteratee shorthand for this: _.map(logicalServers, 'id').
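For reference, the `_.property` iteratee shorthand is just sugar for the callback form; what it does, rendered in plain JavaScript (the server objects here are made up for illustration):

```javascript
// lodash: _.map(logicalServers, 'id') is shorthand for
// _.map(logicalServers, function(it) { return it.id; }).
// The same property pick in plain JavaScript:
var logicalServers = [{ id: 'ls1' }, { id: 'ls2' }];
var ids = logicalServers.map(function(it) { return it.id; });
// ids → ['ls1', 'ls2']
```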

geoff-reid added 2 commits May 4, 2017 13:04
This fixes an issue where the terminal state for a task using
waitingOn anyOf is not correctly calculated.
Modify UCS discovery jobs to allow discovery and cataloging of UCS
nodes in the same graph
@geoff-reid geoff-reid force-pushed the ucs-discovery-changes branch from 65ba80d to b17e2fd Compare May 4, 2017 17:11
self.context.physicalNodeList = _.flattenDeep(self.context.physicalNodeList.concat(
_.map(servers, function(it) {
return [_.get(it, 'id'),
_.get(_.find(it.relations, {"relationType": "encloses"}), 'targets')];


Line is too long.

@geoff-reid geoff-reid force-pushed the ucs-discovery-changes branch from b17e2fd to f8e9b1f Compare May 4, 2017 17:44
@brianparry brianparry merged commit 609edd0 into RackHD:master May 4, 2017