Bug #15128

closed

Boot disk based imaging "operation not permitted" Katello 3.0 RC5

Added by Dylan Baars almost 8 years ago. Updated almost 6 years ago.

Status:
Closed
Priority:
High
Assignee:
Category:
-
Target version:
Difficulty:
Triaged:
Fixed in Releases:
Found in Releases:

Description

Since upgrading to Katello 3.0 RC5, I am unable to use boot disk based imaging -

The VM is created fine, but as it tries to boot I receive the following message on the VM console

http://wellkatellodev.niwa.local/unattended/iPXE?token=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX.... OK
http://wellkatellodev.niwa.local/pulp/repos/NIWA/Library/custom/CentOS7/os_x86_64//images/pxeboot/vmlinuz... Operation not permitted (http://ipxe.org/410c613c)
Could not boot: Operation not permitted (http://ipxe.org/410c613c)


Files

boot_disk_failure.JPG 100 KB Dylan Baars, 05/22/2016 11:17 PM
Actions #1

Updated by Justin Sherrill almost 8 years ago

Do your Apache logs show requests for those files? You might want to run 'tail -f -n 0 /var/log/httpd/*' and try to boot.
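
For example, something along these lines (assuming the default Apache log locations a Katello install uses) shows the boot-time requests as they arrive; the grep filter is just an optional narrowing:

# Follow new entries in every Apache log while the VM attempts its iPXE boot
tail -f -n 0 /var/log/httpd/*

# Optionally narrow it to the requests the iPXE client makes for the boot files
tail -f -n 0 /var/log/httpd/foreman_access.log | grep --line-buffered pxeboot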

Also what did you upgrade from?

Actions #2

Updated by Dylan Baars almost 8 years ago

Justin Sherrill wrote:

Do your Apache logs show requests for those files? You might want to run 'tail -f -n 0 /var/log/httpd/*' and try to boot.

Also what did you upgrade from?

Hi Justin,

Well, it's working now. I can only guess it was a reboot that "fixed" it. Sorry! I upgraded from RC4.

Actions #3

Updated by Eric Helms almost 8 years ago

  • Status changed from New to Rejected
  • Release changed from 86 to 114
Actions #4

Updated by Dylan Baars almost 8 years ago

Hi, this error has reappeared after upgrading to the latest RC. A new VM was created via the Hosts > New Host page using boot disk based imaging. Tailing the httpd logs gives the output below:

> /var/log/httpd/foreman_access.log <
192.168.14.131 - - [13/Jun/2016:14:05:01 +1200] "GET /unattended/iPXE?token=0cca23bb-4e49-4fe3-8696-e4bbbe8633d3&mac=00%3A50%3A56%3A85%3A7c%3Ac5 HTTP/1.1" 200 515 "-" "iPXE/1.0.0 (c4bce43)"
192.168.14.131 - - [13/Jun/2016:14:05:01 +1200] "GET /pulp/repos/NIWA/Prod-Server/CentOS7_Server/custom/CentOS7/os_x86_64/images/pxeboot/vmlinuz HTTP/1.1" 302 - "-" "iPXE/1.0.0 (c4bce43)"

> /var/log/httpd/foreman_error.log <
[Mon Jun 13 14:05:02.378626 2016] [:error] [pid 2701] [client 192.168.14.131:22683] mod_wsgi (pid=2701): Exception occurred processing WSGI script '/usr/share/pulp/wsgi/streamer_auth.wsgi'.
[Mon Jun 13 14:05:02.378815 2016] [:error] [pid 2701] [client 192.168.14.131:22683] Traceback (most recent call last):
[Mon Jun 13 14:05:02.378860 2016] [:error] [pid 2701] [client 192.168.14.131:22683] File "/usr/share/pulp/wsgi/streamer_auth.wsgi", line 36, in allow_access
[Mon Jun 13 14:05:02.378996 2016] [:error] [pid 2701] [client 192.168.14.131:22683] url.validate(key, remote_ip=remote_ip)
[Mon Jun 13 14:05:02.379032 2016] [:error] [pid 2701] [client 192.168.14.131:22683] File "/usr/lib/python2.7/site-packages/pulp/server/lazy/url.py", line 587, in validate
[Mon Jun 13 14:05:02.379288 2016] [:error] [pid 2701] [client 192.168.14.131:22683] policy, signature = self.bundle
[Mon Jun 13 14:05:02.379319 2016] [:error] [pid 2701] [client 192.168.14.131:22683] File "/usr/lib/python2.7/site-packages/pulp/server/lazy/url.py", line 565, in bundle
[Mon Jun 13 14:05:02.379364 2016] [:error] [pid 2701] [client 192.168.14.131:22683] query = Query.decode(self.content.query)
[Mon Jun 13 14:05:02.379433 2016] [:error] [pid 2701] [client 192.168.14.131:22683] File "/usr/lib/python2.7/site-packages/pulp/server/lazy/url.py", line 363, in decode
[Mon Jun 13 14:05:02.379475 2016] [:error] [pid 2701] [client 192.168.14.131:22683] k, v = pair.split('=')
[Mon Jun 13 14:05:02.379500 2016] [:error] [pid 2701] [client 192.168.14.131:22683] ValueError: too many values to unpack
[Mon Jun 13 14:05:02.379539 2016] [:error] [pid 2701] [client 192.168.14.131:22683] mod_wsgi (pid=2701): Client denied by server configuration: '/var/www/streamer/var'.

> /var/log/httpd/foreman_access.log <
192.168.14.131 - - [13/Jun/2016:14:05:01 1200] "GET /streamer/var/lib/pulp/content/distribution/ks-CentOS--7-x86_64/images/pxeboot/vmlinuz?policy=eyJleHRlbnNpb25zIjogeyJyZW1vdGVfaXAiOiAiMTkyLjE2OC4xNC4xMzEifSwgInJlc291cmNlIjogIi9zdHJlYW1lci92YXIvbGliL3B1bHAvY29udGVudC9kaXN0cmlidXRpb24va3MtQ2VudE9TLS03LXg4Nl82NC9pbWFnZXMvcHhlYm9vdC92bWxpbnV6IiwgImV4cGlyYXRpb24iOiAxNDY1NzgzNTkxfQ==;signature=ANlO8fxrFZ3mi9J8bf64XGBOBeM3Wal49VDtZPlJvwpa2X7ezF6tl8jfC7RKrjwWuWxAotT8UMEKn4foZqodZogao4HGaQkddkcFAPrZ53OjYijF_3P4h8fETImWC2cJkY4Cq0lbNi2tQ96dLe7nEEVioXwN1jYOsm42ZBbKbq3wapTU3bAtnSSzD3AjF9G4n9KRJ-YZLmuk1DNxsKI0sMdjiGKiWqz7jJXyji7pkMP_QoGAhhudQwsdmVzB9H3BEEqFoXYmM0Zl7kqB6sB8Msn-UMPoVyOd1GaJ7Wc-FEs9QQ_CnxgF8xEkaMadM2DbbYRB8R-CUb2NbvH2WRZIPQ== HTTP/1.1" 403 287 "-" "iPXE/1.0.0 (c4bce43)"
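
The 403 above matches the traceback: the streamer URL carries a base64-encoded policy and signature, base64 values can end in '=' padding, and pulp's Query.decode splits each key/value pair on every '=', so the pair no longer unpacks into exactly two values. A rough shell illustration of that parsing problem (using a made-up base64 value, not pulp's actual code):

# Hypothetical pair from the streamer query string; the base64 value ends in '=' padding
pair='policy=eyJmb28iOiAiYmFyIn0='

# Splitting on every '=' gives more than two fields, hence
# "ValueError: too many values to unpack" in the traceback above
echo "$pair" | awk -F'=' '{print NF " fields"}'   # prints "3 fields"

# Splitting only on the first '=' keeps the padded value intact
key=$(echo "$pair" | cut -d'=' -f1)      # policy
value=$(echo "$pair" | cut -d'=' -f2-)   # eyJmb28iOiAiYmFyIn0=
echo "$key -> $value"
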
Actions #5

Updated by Eric Helms almost 8 years ago

  • Status changed from Rejected to New
  • Release deleted (114)
Actions #6

Updated by Jonathan Dean almost 8 years ago

We had the same issue on our installation after migration to 3.0 RC5 to 2.3. Our issue was that the kickstart repository had become corrupted. To fix we simply disabled/re-enabled the kickstart repository.

Actions #7

Updated by Jonathan Dean almost 8 years ago

to 3.0 RC5 from 2.3 (for some reason I can't edit my previous post)

Actions #8

Updated by Dylan Baars almost 8 years ago

Jonathan Dean wrote:

We had the same issue on our installation after migration to 3.0 RC5 to 2.3. Our issue was that the kickstart repository had become corrupted. To fix we simply disabled/re-enabled the kickstart repository.

Hi Jonathan,

Not sure what you mean - I can't see anywhere under Products, Content Views, or Installation Media where I can disable/enable a repository. Help?

Thanks,
Dylan

Actions #9

Updated by Jonathan Dean almost 8 years ago

Dylan Baars wrote:

Jonathan Dean wrote:

We had the same issue on our installation after migration to 3.0 RC5 to 2.3. Our issue was that the kickstart repository had become corrupted. To fix we simply disabled/re-enabled the kickstart repository.

Hi Jonathan,

Not sure what you mean - I can't see anywhere under Products, Content Views, or Installation Media where I can disable/enable a repository. Help?

Thanks,
Dylan

Look under Content > Red Hat Repositories, then go to Kickstarts and find your kickstart repository(ies) - un-check and re-check each one.
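
If the hammer CLI is installed, the same toggle can likely be done from the command line as well. This is only a sketch: the organization and product names here are placeholders, <SET_ID> stands in for the real repository set id, the option names may differ between hammer/Katello versions, and it only applies to Red Hat repository sets (custom products are handled under Content > Products instead):

# Find the kickstart repository set for the product (Red Hat repositories only)
hammer repository-set list --organization "ExampleOrg" --product "Red Hat Enterprise Linux Server"

# Disable and then re-enable it so the kickstart tree gets recreated
hammer repository-set disable --organization "ExampleOrg" --id <SET_ID> --basearch x86_64 --releasever 7Server
hammer repository-set enable --organization "ExampleOrg" --id <SET_ID> --basearch x86_64 --releasever 7Server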

Actions #10

Updated by Dylan Baars almost 8 years ago

Right, so I'm using CentOS on my dev box, not RHEL. On checking things further, one of the content views seemed to be corrupt - it could not publish new versions. I created a new content view for CentOS7 Server, moved my test boxes over to it, and did the same for the hostgroup configuration. I also had to make sure that in each hostgroup no media was selected on the Operating System tab. After doing those two things I could remove all content view versions for that content view, but trying to delete the content view completely gives me the error

"Cannot delete record because of dependent content_facets"

/var/log/foreman/production.log output when I try to do so:

2016-06-16 15:54:01 [app] [I] Started GET "/errata/views/errata-counts.html" for 192.168.222.132 at 2016-06-16 15:54:01 +1200
2016-06-16 15:54:01 [app] [I] Started GET "/katello/api/v2/content_view_versions?content_view_id=4&page=1&search=" for 192.168.222.132 at 2016-06-16 15:54:01 +1200
2016-06-16 15:54:01 [app] [I] Processing by Katello::Api::V2::ContentViewVersionsController#index as JSON
2016-06-16 15:54:01 [app] [I] Parameters: {"content_view_id"=>"4", "page"=>"1", "search"=>"", "api_version"=>"v2"}
2016-06-16 15:54:01 [app] [I] Rendered /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/views/katello/api/v2/content_view_versions/index.json.rabl within katello/api/v2/layouts/collection (3.2ms)
2016-06-16 15:54:01 [app] [I] Completed 200 OK in 29ms (Views: 5.5ms | ActiveRecord: 3.4ms)
2016-06-16 15:54:01 [app] [I] Processing by Katello::Api::V2::ContentViewsController#show as JSON
2016-06-16 15:54:01 [app] [I] Parameters: {"organization_id"=>"6", "api_version"=>"v2", "id"=>"4"}
2016-06-16 15:54:02 [app] [I] Rendered /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/views/katello/api/v2/content_views/show.json.rabl within katello/api/v2/layouts/resource (111.7ms)
2016-06-16 15:54:02 [app] [I] Completed 200 OK in 133ms (Views: 107.3ms | ActiveRecord: 15.5ms)
2016-06-16 15:54:08 [app] [I] Started GET "/content-views/deletion/views/content-view-deletion.html" for 192.168.222.132 at 2016-06-16 15:54:08 +1200
2016-06-16 15:54:08 [app] [I] Started GET "/katello/api/v2/content_view_versions?content_view_id=4&page=1&search=" for 192.168.222.132 at 2016-06-16 15:54:08 +1200
2016-06-16 15:54:08 [app] [I] Processing by Katello::Api::V2::ContentViewVersionsController#index as JSON
2016-06-16 15:54:08 [app] [I] Parameters: {"content_view_id"=>"4", "page"=>"1", "search"=>"", "api_version"=>"v2"}
2016-06-16 15:54:08 [app] [I] Rendered /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/views/katello/api/v2/content_view_versions/index.json.rabl within katello/api/v2/layouts/collection (13.8ms)
2016-06-16 15:54:08 [app] [I] Completed 200 OK in 63ms (Views: 25.2ms | ActiveRecord: 5.1ms)
2016-06-16 15:54:09 [app] [I] Started DELETE "/katello/api/v2/content_views/4?organization_id=6" for 192.168.222.132 at 2016-06-16 15:54:09 +1200
2016-06-16 15:54:09 [app] [I] Processing by Katello::Api::V2::ContentViewsController#destroy as JSON
2016-06-16 15:54:09 [app] [I] Parameters: {"organization_id"=>"6", "api_version"=>"v2", "id"=>"4"}
2016-06-16 15:54:10 [foreman-tasks/action] [E] Cannot delete record because of dependent content_facets (ActiveRecord::DeleteRestrictionError) | /opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/associations/has_many_association.rb:13:in `handle_dependency' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/associations/builder/association.rb:135:in `block in add_before_destroy_callbacks' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:440:in `instance_exec' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:440:in `block in make_lambda' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:160:in `call' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:160:in `block in halting' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `call' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `block in halting' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `call' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `block in halting' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `call' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `block in halting' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `call' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `block in halting' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `call' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `block in halting' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `call' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `block in halting' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `call' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `block in halting' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `call' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:166:in `block in halting' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:86:in `call' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activesupport-4.1.5/lib/active_support/callbacks.rb:86:in `run_callbacks' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/callbacks.rb:292:in `destroy' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/transactions.rb:263:in `block in destroy' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/transactions.rb:329:in `block in with_transaction_returning_status' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/connection_adapters/abstract/database_statements.rb:199:in `transaction' | 
/opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/transactions.rb:208:in `transaction' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/transactions.rb:326:in `with_transaction_returning_status' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/transactions.rb:263:in `destroy' | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/models/katello/model.rb:7:in `destroy!' | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/lib/actions/katello/content_view/destroy.rb:27:in `finalize' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/action.rb:528:in `block (2 levels) in execute_finalize' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:26:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:26:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware.rb:17:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/lib/actions/middleware/remote_action.rb:20:in `block in finalize' | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/lib/actions/middleware/remote_action.rb:40:in `block in as_remote_user' | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/models/katello/concerns/user_extensions.rb:20:in `cp_config' | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/lib/actions/middleware/remote_action.rb:27:in `as_cp_user' | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/lib/actions/middleware/remote_action.rb:39:in `as_remote_user' | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/lib/actions/middleware/remote_action.rb:20:in `finalize' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:22:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:26:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware.rb:17:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/action/progress.rb:30:in `with_progress_calculation' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/action/progress.rb:22:in `finalize' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:22:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:26:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware.rb:17:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/lib/actions/middleware/keep_locale.rb:15:in `block in finalize' | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/lib/actions/middleware/keep_locale.rb:22:in `with_locale' | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/lib/actions/middleware/keep_locale.rb:15:in `finalize' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:22:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:26:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware.rb:17:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware.rb:38:in `finalize' | 
/opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:22:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/world.rb:30:in `execute' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/action.rb:527:in `block in execute_finalize' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/action.rb:419:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/action.rb:419:in `block in with_error_handling' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/action.rb:419:in `catch' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/action.rb:419:in `with_error_handling' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/action.rb:526:in `execute_finalize' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/action.rb:260:in `execute' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:9:in `block (2 levels) in execute' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/execution_plan/steps/abstract.rb:155:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/execution_plan/steps/abstract.rb:155:in `with_meta_calculation' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:8:in `block in execute' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:22:in `open_action' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:7:in `execute' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/executors/parallel/sequential_manager.rb:68:in `run_step' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/executors/parallel/sequential_manager.rb:53:in `dispatch' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/executors/parallel/sequential_manager.rb:60:in `block in run_in_sequence' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/executors/parallel/sequential_manager.rb:60:in `each' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/executors/parallel/sequential_manager.rb:60:in `all?' 
| /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/executors/parallel/sequential_manager.rb:60:in `run_in_sequence' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/executors/parallel/sequential_manager.rb:49:in `dispatch' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/executors/parallel/sequential_manager.rb:27:in `block in finalize' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:26:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:26:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware.rb:17:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware.rb:46:in `finalize_phase' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:22:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:26:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware.rb:17:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware.rb:46:in `finalize_phase' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:22:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:26:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware.rb:17:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware.rb:46:in `finalize_phase' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:22:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:26:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware.rb:17:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/common/transaction.rb:16:in `block in rollback_on_error' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/connection_adapters/abstract/database_statements.rb:201:in `block in transaction' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/connection_adapters/abstract/database_statements.rb:209:in `within_new_transaction' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/connection_adapters/abstract/database_statements.rb:201:in `transaction' | /opt/rh/rh-ror41/root/usr/share/gems/gems/activerecord-4.1.5/lib/active_record/transactions.rb:208:in `transaction' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/transaction_adapters/active_record.rb:5:in `transaction' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/common/transaction.rb:15:in `rollback_on_error' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/common/transaction.rb:9:in `finalize_phase' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/stack.rb:22:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/middleware/world.rb:30:in `execute' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/executors/parallel/sequential_manager.rb:26:in `finalize' | 
/opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/executors/parallel/worker.rb:18:in `block in on_message' | /opt/theforeman/tfm/root/usr/share/gems/gems/algebrick-0.7.3/lib/algebrick/matchers/abstract.rb:74:in `block in assigns' | /opt/theforeman/tfm/root/usr/share/gems/gems/algebrick-0.7.3/lib/algebrick/matchers/abstract.rb:73:in `tap' | /opt/theforeman/tfm/root/usr/share/gems/gems/algebrick-0.7.3/lib/algebrick/matchers/abstract.rb:73:in `assigns' | /opt/theforeman/tfm/root/usr/share/gems/gems/algebrick-0.7.3/lib/algebrick/matching.rb:56:in `match_value' | /opt/theforeman/tfm/root/usr/share/gems/gems/algebrick-0.7.3/lib/algebrick/matching.rb:36:in `block in match?' | /opt/theforeman/tfm/root/usr/share/gems/gems/algebrick-0.7.3/lib/algebrick/matching.rb:35:in `each' | /opt/theforeman/tfm/root/usr/share/gems/gems/algebrick-0.7.3/lib/algebrick/matching.rb:35:in `match?' | /opt/theforeman/tfm/root/usr/share/gems/gems/algebrick-0.7.3/lib/algebrick/matching.rb:23:in `match' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/executors/parallel/worker.rb:12:in `on_message' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/context.rb:46:in `on_envelope' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/executes_context.rb:7:in `on_envelope' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.10/lib/dynflow/actor.rb:26:in `on_envelope' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/awaits.rb:15:in `on_envelope' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/sets_results.rb:14:in `on_envelope' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/buffer.rb:38:in `process_envelope' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/buffer.rb:31:in `process_envelopes?' 
| /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/buffer.rb:20:in `on_envelope' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/termination.rb:55:in `on_envelope' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/removes_child.rb:10:in `on_envelope' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/sets_results.rb:14:in `on_envelope' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/core.rb:161:in `process_envelope' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/core.rb:95:in `block in on_envelope' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/core.rb:118:in `block (2 levels) in schedule_execution' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/synchronization/mri_lockable_object.rb:38:in `block in synchronize' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/synchronization/mri_lockable_object.rb:38:in `synchronize' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/synchronization/mri_lockable_object.rb:38:in `synchronize' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/core.rb:115:in `block in schedule_execution' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/serialized_execution.rb:18:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/serialized_execution.rb:18:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/serialized_execution.rb:96:in `work' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/serialized_execution.rb:77:in `block in call_job' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:333:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:333:in `run_task' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:322:in `block (3 levels) in create_worker' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:305:in `loop' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:305:in `block (2 levels) in create_worker' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:304:in `catch' | /opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:304:in `block in create_worker' | 
/opt/theforeman/tfm/root/usr/share/gems/gems/logging-1.8.2/lib/logging/diagnostic_context.rb:323:in `call' | /opt/theforeman/tfm/root/usr/share/gems/gems/logging-1.8.2/lib/logging/diagnostic_context.rb:323:in `block in create_with_logging_context'
2016-06-16 15:54:10 [app] [I] Rendered /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.rc8/app/views/katello/api/v2/common/async.json.rabl within katello/api/v2/layouts/resource (63.7ms)
2016-06-16 15:54:10 [app] [I] Completed 202 Accepted in 414ms (Views: 64.1ms | ActiveRecord: 66.8ms)
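
The "dependent content_facets" error generally means some hosts still reference the content view (Katello stores each host's content view assignment as a content facet), so the view cannot be deleted until they are moved off it. A hedged hammer sketch of checking and reassigning - the replacement view name is made up, the host name is guessed from this thread, and option names may differ by hammer/Katello version:

# Hosts still attached to the old content view block its deletion
hammer host list --organization "NIWA" --search 'content_view = "CentOS7 Server"'

# Reassign each one to the replacement view/environment, then retry the delete
hammer host update --name testweb01d.niwa.local --content-view "CentOS7 Server (new)" --lifecycle-environment "Prod-Server"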

However, I have tried again to create a new host - and it is now booting successfully and imaging as I write this. After the install I found that one of the other repositories (the saltstack one) had become corrupted - trying to install salt-minion gave the following:

Downloading packages:
PyYAML-3.11-1.el7.x86_64.rpm FAILED
python-tornado-4.2.1-1.el7.x86 FAILED 0% [ ] 0.0 B/s | 0 B --:--:-- ETA
python-zmq-14.7.0-1.el7.x86_64 FAILED 0% [ ] 0.0 B/s | 0 B --:--:-- ETA
(1/3): python-zmq-14.7.0-1.el7.x86_64.rpm 0% [ ] 0.0 B/s | 0 B --:--:-- ETA

Error downloading packages:
python-zmq-14.7.0-1.el7.x86_64: [Errno 256] No more mirrors to try.
PyYAML-3.11-1.el7.x86_64: [Errno 256] No more mirrors to try.
python-tornado-4.2.1-1.el7.x86_64: [Errno 256] No more mirrors to try.

I'm currently removing that repository from my content view; I will delete it and re-try. It seems like the upgrade from RC5 to RC8 did something weird with my repositories/content views!
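
On the client side, once the broken repo has been removed or republished, yum's cached metadata usually has to be cleared before the install will go through again; a minimal sketch:

# Drop stale repo metadata, refresh the entitlement certs, then retry the install
yum clean all
subscription-manager refresh
yum install salt-minion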

Actions #11

Updated by Justin Sherrill almost 8 years ago

  • Assignee set to Chris Duryee
Actions #12

Updated by Dylan Baars almost 8 years ago

Further update: the imaged VMs could not be registered using subscription-manager. The error is:

[root@testweb01d ~]# subscription-manager refresh
This system is not yet registered. Try 'subscription-manager register --help' for more information.
[root@testweb01d ~]# subscription-manager register --org="NIWA" --activationkey="Apache Production Servers"
Validation failed: Base Content view 'CentOS7 Server' is not in environment 'Prod-Server'

This was even though the activation key was configured to point at the newly created content view. I created a new activation key to test - same result:

[root@testweb01d ~]# subscription-manager register --org="NIWA" --activationkey="NIWA Apache Production Servers"
Validation failed: Base Content view 'CentOS7 Server' is not in environment 'Prod-Server'
[root@testweb01d ~]#

So I went back to my content views and cancelled the task that was stuck attempting to delete the "CentOS7 Server" content view (which, as per the above, no longer had any versions associated with it). I then used pulp-admin to delete the repositories that were still hanging around:

pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-CentOS7-centosplus_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-CentOS7-os_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-CentOS7-extras_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-CentOS7-updates_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-CentOS7-cr_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-EPEL7-EPEL7_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-Postgres-Postgres_9_4_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-Foreman-Foreman_1_10
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-Foreman-Foreman_plugins
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-Jenkins-Jenkins_LTS
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-Katello_Agent-Katello_Agent_3_0_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-Softwarecollections-rhscl-ruby193-epel-7-x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-NIWA-sdt_apps
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-CentOS7-extras_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-EPEL7-EPEL7_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-Foreman-Foreman_plugins
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-CentOS7-os_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-CentOS7-centosplus_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-CentOS7-updates_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-CentOS7-cr_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-Jenkins-Jenkins_LTS
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-Foreman-Foreman_1_10
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-Postgres-Postgres_9_4_x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-NIWA-sdt_apps
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-Softwarecollections-rhscl-ruby193-epel-7-x86_64
pulp-admin rpm repo delete --repo-id=NIWA-Library-CentOS7_Server-Katello_Agent-Katello_Agent_3_0_x86_64

(note: not all of these were still there, but a large number were)

I then published a new content view version, which succeeded, promoted it through to the "Prod-Server" environment, and attempted to subscribe a system again:

[root@testweb01d ~]# subscription-manager register --org="NIWA" --activationkey="Apache Production Servers"
The system has been registered with ID: e61d4bf3-fd6e-4a83-a9dd-d6f9d08fe4f3

No products installed.
[root@testweb01d ~]#

Perfect. Also, with the saltstack repo deleted and re-created, salt-minion now installs fine again. I am about to try a re-image and will report what happens
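
For reference, the leftover Pulp repos above can also be found and removed in one pass instead of one delete per ID; a sketch assuming pulp-admin is already logged in and that its list output prints an "Id:" line per repository:

# Filter the repo list down to the orphaned content-view copies and delete each one
pulp-admin rpm repo list | awk '/^Id:/ {print $2}' | grep 'NIWA-Library-CentOS7_Server' | while read -r repo_id; do
  pulp-admin rpm repo delete --repo-id="$repo_id"
done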

Actions #13

Updated by Dylan Baars almost 8 years ago

Dylan Baars wrote:

Perfect. Also, with the saltstack repo deleted and re-created, salt-minion now installs fine again. I am about to try a re-image and will report what happens

and "new host" creation worked perfectly. Go figure, so it seems to be some sort of corruption to the "base content view" - couple of questions:

1. What is a base content view?
2. How does a "base content view" get created? I can't remember, but it was probably the first content view created
3. Can a base content view be changed?

It seems the base content view is linked to activation keys, so even though the activation key was associated with a different content view, subscriptions were blocked because the base content view didn't have a version in the required environment path. Is this expected behaviour?
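
For what it's worth, what an activation key will hand out at registration time can also be checked and repointed from the CLI; a sketch assuming hammer is available, using the names from this thread with the replacement content view name made up:

# Show the content view / environment the key currently points at
hammer activation-key info --organization "NIWA" --name "Apache Production Servers"

# Repoint it at the replacement content view in the right lifecycle environment
hammer activation-key update --organization "NIWA" --name "Apache Production Servers" --content-view "CentOS7 Server (new)" --lifecycle-environment "Prod-Server"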

Is there anything I can provide to help figure out why the content view/pulp backend repository(ies) somehow got corrupted during the RC5-to-latest-RC upgrade?

Thanks :-)
Dylan

Actions #14

Updated by Chris Duryee almost 8 years ago

Dylan,

I created https://pulp.plan.io/issues/2031 for comment #4.

Can you give a rundown of the current issues? It looks like there were a couple of different problems happening, so we may need to split them out into multiple Redmine issues.

Actions #15

Updated by Dylan Baars almost 8 years ago

Chris Duryee wrote:

Dylan,

I created https://pulp.plan.io/issues/2031 for comment #4.

Can you give a rundown of the current issues? It looks like there were a couple of different problems happening, so we may need to split them out into multiple Redmine issues.

Hi, sorry for the slow reply. Since blowing away the repos using pulp-admin and publishing a new version, everything has been good again. I think the primary issue was the one you've created a Pulp issue for - everything else was fallout from me trying to correct things. I don't think the repo/content view was necessarily corrupted; while I can't test again, I'm 99% sure already-managed VMs could still get data and install packages from the content view.

Actions #16

Updated by Chris Duryee almost 8 years ago

  • Status changed from New to Closed

Cool, sounds good. I will close this issue for now since the Pulp Redmine issue exists, but feel free to re-open or open a new issue if you hit further problems.

Actions #17

Updated by Justin Sherrill almost 8 years ago

  • Release set to 166