Pardot
Jul 25, 2001




Pollyanna posted:

One of the senior engineers on the team says that some of our service code shouldn’t actually set any default values for a given model (which needs it to build the form), and that the default values for the new model should instead be determined by whatever the form partial defines as the defaults. I really dislike the idea of our views being the ones in charge of default values, but I don’t know if I’m being too dogmatic about it. Am I overly concerned about this?

No, you are correct. That's loving weird. Defaults and constraints should be in postgres, and maybe also in the models. In the views is wrong.
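In plain Ruby terms, "defaults live with the model" looks something like this. This is a sketch with a hypothetical Widget model (nothing from the actual codebase); the view just renders whatever the model hands it:

```ruby
# Hypothetical model: defaults are owned by the model layer (mirroring a
# database-level default), so every form built from it agrees on them.
class Widget
  DEFAULTS = { status: "draft", priority: 3 }.freeze

  attr_reader :attributes

  def initialize(attributes = {})
    # Caller-supplied values win; anything missing falls back to DEFAULTS.
    @attributes = DEFAULTS.merge(attributes)
  end
end

widget = Widget.new(priority: 1)
widget.attributes # => { status: "draft", priority: 1 }
```

The form partial then never needs to know what the defaults are, and the database can enforce the same values as a backstop.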


Volguus
Mar 3, 2009

Pollyanna posted:

One of the senior engineers on the team says that some of our service code shouldn’t actually set any default values for a given model (which needs it to build the form), and that the default values for the new model should instead be determined by whatever the form partial defines as the defaults. I really dislike the idea of our views being the ones in charge of default values, but I don’t know if I’m being too dogmatic about it. Am I overly concerned about this?

While I agree that views should be as dumb as possible, you can make a compromise where, once the form is submitted, the back end sets default values for anything that is missing.
I don't think this is a fight worth having, but mostly... meh, it should be done the other way.

Doh004
Apr 22, 2007

Mmmmm Donuts...

Pollyanna posted:

One of the senior engineers on the team says that some of our service code shouldn’t actually set any default values for a given model (which needs it to build the form), and that the default values for the new model should instead be determined by whatever the form partial defines as the defaults. I really dislike the idea of our views being the ones in charge of default values, but I don’t know if I’m being too dogmatic about it. Am I overly concerned about this?

Have they explained why?

Chilled Milk
Jun 22, 2003

No one here is alone,
satellites in every home

Pollyanna posted:

One of the senior engineers on the team says that some of our service code shouldn’t actually set any default values for a given model (which needs it to build the form), and that the default values for the new model should instead be determined by whatever the form partial defines as the defaults. I really dislike the idea of our views being the ones in charge of default values, but I don’t know if I’m being too dogmatic about it. Am I overly concerned about this?

That's definitely weird and I'd want to know why. If the form's object is being built deeper than the controller action, I guess there could be weird side effects. But even then I'd reach for custom constructor methods before putting that logic in the view.

Pollyanna
Mar 5, 2005

Milk's on them.


:iiam: I’m guessing I just misunderstood him, or maybe we were actually talking about putting the default values (if not present) in a FormBuilder, though I didn’t think FormBuilders could do that.

His main point is that form generation should account for the defaults so that you don’t have to hardcode any default values, but I don’t see how you avoid hardcoding defaults somewhere. This is why I hate Rails forms :negative:

Trammel
Dec 31, 2007
.

Aquarium of Lies posted:

Rails 6 has parallel testing built in, which we’re looking forward to :) Other devs have tried parallelizing in the past without much luck, so I’m curious to see if it’ll work better with 6.

The Rails 6 parallel testing is for Test::Unit, not RSpec. There's an open issue about supporting it in rspec-rails, but "We'd love to support parallel tests with RSpec itself, its a rather large task however."

Still, the multi-database connection stuff is nice.

ddiddles
Oct 21, 2008

Roses are red, violets are blue, I'm a schizophrenic and so am I
Been bumbling my way through learning Rails, wondering if I'm setting up my tests even semi-correctly.

I have a Project model that belongs to a user, and I'm testing that the controller's show route only allows the authenticated user to see their own projects. Is there anything glaringly wrong with this test?

code:
describe 'GET #show' do
  context 'with unauthenticated user' do
    it 'returns unauthorized' do
      get :show, params: { id: 1 }
      expect(response).to be_unauthorized
    end
  end

  context 'with authenticated user' do
    let(:user) { create(:user) }
    let(:user2) { create(:user) }
    let(:user_project) { create(:project, name: 'First user project', user_id: user.id) }
    let(:user2_project) { create(:project, name: 'Second user project', user_id: user2.id) }

    it 'returns the project if it belongs to the user' do
      authorize_header(user)
      get :show, params: { id: user_project.id }
      json = JSON.parse(response.body)

      expect(json["name"]).to eq(user_project.name)

      get :show, params: { id: user2_project.id }
      expect(response).to be_not_found

      authorize_header(user2)
      get :show, params: { id: user2_project.id }
      expect(JSON.parse(response.body)['name']).to eq(user2_project.name)

      get :show, params: { id: user_project.id }
      expect(response).to be_not_found
    end
  end
end

Trammel
Dec 31, 2007
.

ddiddles posted:

Been bumbling my way through learning Rails, wondering if I'm setting up my tests even semi-correctly.

I have a Project model that belongs to a user, and I'm testing that the controller's show route only allows the authenticated user to see their own projects. Is there anything glaringly wrong with this test?

Nothing glaringly wrong, in fact, it's a very good test for a first rails app.

It often helps to keep each test to a single `expect` statement, so that each test description matches a single simple expectation, keeping the individual tests smaller and easier to reason about.

It's not obvious from the code what `authorize_header` does; it seems to set some kind of global, or similar. Effectively it looks like a side effect, and while Rails' `render` within a controller is itself a nasty side effect, it's best to avoid them where possible.

There's no real need for the last two expectations: you've already tested the expected behaviour, and in production users can't re-authorize and change identities within the lifespan of a request.

rubocop is also great at giving feedback and explaining why certain "best practices" exist.

code:
RSpec.describe ProjectController, type: :controller do
  describe 'GET #show' do
    context 'without an authenticated user' do
      it 'returns unauthorized' do
        get :show, params: { id: 1 }
        expect(response).to be_unauthorized
      end
    end

    context 'with an authenticated user' do
      let(:user) { create(:user) }
      let(:user2) { create(:user) }
      let(:user_project) { create(:project, { name: 'First user project', user_id: user.id }) }
      let(:user2_project) { create(:project, { name: 'Second user project', user_id: user2.id })}

      before do
        authorize_header(user)
      end

      it 'returns the project if it belongs to the user' do
        get :show, params: { id: user_project.id }
        json = JSON.parse(response.body)
        expect(json["name"]).to eq(user_project.name)
      end

      it "returns not-found for a project that doesn't belong to the user" do
        get :show, params: { id: user2_project.id }
        expect(response).to be_not_found
      end
    end
  end
end
You could also move the project_id into its own context block, around each of the tests, further splitting the context from the description of the expected behaviour. But it's very easy to end up with ridiculously nested rspec contexts.

code:
RSpec.describe ProjectController, type: :controller do
  describe 'GET #show' do
    context 'without an authenticated user' do
      it 'returns unauthorized' do
        get :show, params: { id: 1 }
        expect(response).to be_unauthorized
      end
    end

    context 'with an authenticated user' do
      let(:user) { create(:user) }
      let(:user2) { create(:user) }
      let(:user_project) { create(:project, { name: 'First user project', user_id: user.id }) }
      let(:user2_project) { create(:project, { name: 'Second user project', user_id: user2.id }) }

      before do
        authorize_header(user)
        get :show, params: { id: project_id }
      end

      context 'with a project-id belonging to the user' do
        let(:project_id) { user_project.id }

        it 'returns the project' do
          json = JSON.parse(response.body)
          expect(json["name"]).to eq(user_project.name)
        end
      end

      context "with a project-id that doesn't belong to the user" do
        let(:project_id) { user2_project.id }

        it "returns not-found" do
          expect(response).to be_not_found
        end
      end
    end
  end
end 
Just keep your controllers as thin as possible. Writing tests is a great way to force yourself to decouple application logic from the controllers: the larger the controller, the harder it is to test and the slower the tests run. Small controllers, by contrast, have virtually no logic of their own to test; they often just mix in functionality, which can be tested separately.
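As a sketch of that decoupling (a hypothetical ProjectCreator; the name and return shape are made up, and the real version would persist an ActiveRecord model), the logic a fat create action would hold becomes a PORO you can unit-test without any HTTP machinery:

```ruby
# Hypothetical service object extracted from a fat controller action.
# The controller shrinks to roughly:
#   ok, payload = ProjectCreator.new(user_id: current_user.id, params: project_params).call
class ProjectCreator
  def initialize(user_id:, params:)
    @user_id = user_id
    @params = params
  end

  # Returns [ok, payload]; the real version would create a Project record.
  def call
    name = @params[:name].to_s.strip
    return [false, { errors: ["name can't be blank"] }] if name.empty?

    [true, { name: name, user_id: @user_id }]
  end
end
```

The validation edge cases then get fast unit tests against `#call`, leaving only the happy path to a request spec.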

necrotic
Aug 2, 2005
I owe my brother big time for this!

Trammel posted:

It often helps to keep each test to a single `expect` statement, so that each test description matches a single simple expectation, keeping the individual tests smaller and easier to reason about.

This seems like a good idea until your app and test suite grow: the setup and teardown for a bunch of tiny tests can really add up quickly. The approach you have now is completely reasonable for an integration test (which these are). For unit tests, where you aren't creating records or calling services, one expect per test is more reasonable.

Pollyanna
Mar 5, 2005

Milk's on them.


Aren’t controller specs getting deprecated?

necrotic
Aug 2, 2005
I owe my brother big time for this!
Integration tests aren't being deprecated; those go through the request flow. Testing the controllers directly is indeed deprecated, so test the routes.

That is specific to TestUnit, the default test harness in Rails. Dunno if RSpec is changing anything.

kayakyakr
Feb 16, 2004

Kayak is true

necrotic posted:

Integration tests aren't being deprecated; those go through the request flow. Testing the controllers directly is indeed deprecated, so test the routes.

That is specific to TestUnit, the default test harness in Rails. Dunno if RSpec is changing anything.

rspec broke controller tests a long, long time ago. They're still possible, but pretty hard with the newest versions. Mostly integration tests with a few model unit tests thrown in for good measure is the way to go.

necrotic
Aug 2, 2005
I owe my brother big time for this!
That is a controller test being described there, though. Route tests are integration tests: they go through the routing engine.

ddiddles
Oct 21, 2008

Roses are red, violets are blue, I'm a schizophrenic and so am I

Trammel posted:

...

Just keep your controllers as thin as possible. Writing tests is a great way to force yourself to decouple application logic from the controllers: the larger the controller, the harder it is to test and the slower the tests run. Small controllers, by contrast, have virtually no logic of their own to test; they often just mix in functionality, which can be tested separately.

This is awesome, thanks for all that info.

Going to read up on request specs. Coming from the front-end world, this test environment is already 1000x better to work in.

kayakyakr
Feb 16, 2004

Kayak is true

ddiddles posted:

This is awesome, thanks for all that info.

Going to read up on request specs. Coming from the front-end world, this test environment is already 1000x better to work in.

I've been really impressed with Cypress for front-end integration testing.

Trammel
Dec 31, 2007
.

necrotic posted:

This seems like a good idea until your app and test suite grow: the setup and teardown for a bunch of tiny tests can really add up quickly. The approach you have now is completely reasonable for an integration test (which these are). For unit tests, where you aren't creating records or calling services, one expect per test is more reasonable.

I'm definitely biased towards lots of unit tests with instance doubles, plus a few integration tests to prove out the basic success and failure paths. But the integration vs unit testing debate has been argued endlessly and I don't think the correct way has been proven one way or the other. I do think it's probably easier to refactor for speed later by combining tests than it is to refactor for clarity.

Then again, how many integration tests do you want in a scenario like this? There are likely to be more routes with authentication and access limitations, and testing those limitations on every route gets tedious, repetitive and slow, even with shared examples. Can we test the authentication and access control in a single place, without integration tests?
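One way to answer that (a hypothetical policy object, not anything from the posted app): move the access rule into a PORO, unit-test it exhaustively, and keep just one or two request specs proving the before_action is actually wired up:

```ruby
# Hypothetical policy: the access rule lives in one place and needs no
# HTTP machinery to test. `project` is anything with a :user_id key.
class ProjectPolicy
  def initialize(current_user_id, project)
    @current_user_id = current_user_id
    @project = project
  end

  # Only an authenticated user who owns the project may see it.
  def show?
    !@current_user_id.nil? && @project[:user_id] == @current_user_id
  end
end
```

Pundit formalizes this pattern if you want a library for it, but the plain object is enough to get the auth logic out of every route's integration tests.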

Tea Bone
Feb 18, 2011

I'm going for gasps.
I need to generate PDFs from a template PDF supplied by a client. I’m currently using an ancient version of prawn to handle this, as support for template PDFs was extracted from prawn back in 2014.

I'm aware that there's the prawn-templates gem, but this feels like a temporary solution, as there was a reason template support was dropped.

I’ve looked into switching over to wicked, but it looks like I’d need to convert the supplied templates into HTML? That isn’t really ideal, since the client updates the template semi-regularly and it would be a huge pain to re-create the HTML each time.

Are there any other alternatives out there?

xtal
Jan 9, 2011

by Fluffdaddy

Tea Bone posted:

I need to generate PDFs from a template PDF supplied by a client. I’m currently using an ancient version of prawn to handle this, as support for template PDFs was extracted from prawn back in 2014.

I'm aware that there's the prawn-templates gem, but this feels like a temporary solution, as there was a reason template support was dropped.

I’ve looked into switching over to wicked, but it looks like I’d need to convert the supplied templates into HTML? That isn’t really ideal, since the client updates the template semi-regularly and it would be a huge pain to re-create the HTML each time.

Are there any other alternatives out there?

Can you explain more about the problem? I think you could do that by using PDF forms (the best approach) or by overlaying a different PDF on top of it (a hacky workaround.) Parsing and generating a PDF according to a template in a way that preserves its appearance is extremely hard, and would be misusing the format, IMO.
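For the PDF-forms route, one common option is the pdf-forms gem, a thin wrapper around the external pdftk CLI (which has to be installed separately). A sketch, with made-up field names and paths:

```ruby
require "pdf_forms"  # gem 'pdf-forms'; shells out to the pdftk binary

pdftk = PdfForms.new("/usr/local/bin/pdftk")

# Inspect which fillable fields the client's template actually defines:
field_names = pdftk.get_field_names("template.pdf")

# Fill them in; flatten so the output is no longer editable:
pdftk.fill_form(
  "template.pdf",
  "output.pdf",
  { "customer_name" => "Jane Doe", "order_no" => "WG9240" },
  flatten: true
)
```

The catch, as discussed below, is that someone still has to add the form fields to each template revision.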

Tea Bone
Feb 18, 2011

I'm going for gasps.

xtal posted:

Can you explain more about the problem? I think you could do that by using PDF forms (the best approach) or by overlaying a different PDF on top of it (a hacky workaround.) Parsing and generating a PDF according to a template in a way that preserves its appearance is extremely hard, and would be misusing the format, IMO.

Yeah, sure. Basically it's a web portal that generates a variety of PDFs for the end user to download, based on values they submit in a form. The client who owns the portal sends us a batch of template PDFs intermittently (every 3 months or so), usually with just minor copy changes, but occasionally with entirely new layouts or requirements to re-order or add fields to the generated PDF.

Ideally I want to just drop the templates into a library and point the output-generating script at the template location.

At the moment I generate a new PDF with prawn, point prawn's template parameter at our template PDFs, then draw text boxes at the required x,y coordinates to fill in the blanks.

I'm not too sure about the internal workings of prawn, but I suspect it's doing what you suggested in your second point (it just overlays my text boxes as a new PDF over the template PDF).

I agree that PDF forms are probably the way to go, but that means either us adding the form fields to the template, or getting the client to supply templates with the fields already in place.

I'm starting to think generating the output PDF from another PDF is the wrong way to go about this. Maybe I'd be better off generating everything as an image and then converting it to PDF? The output PDF doesn't need to be editable (or even selectable), but the templates are usually multiple pages long, with only one or two of those pages needing anything overlaid.

Cock Democracy
Jan 1, 2003

Now that is the finest piece of chilean sea bass I have ever smelled

Tea Bone posted:

I'm starting to think generating the output PDF from another PDF is the wrong way to go about this. Maybe I'd be better off generating everything as an image and then converting it to PDF? The output PDF doesn't need to be editable (or even selectable), but the templates are usually multiple pages long, with only one or two of those pages needing anything overlaid.
Yeah, that seems reasonable to me. I could even see having a UI where you upload the new PDF, it gets converted to an image, and you can choose the x,y values by clicking where the text should go. Then you could be really lazy and have someone else keep it updated!

ddiddles
Oct 21, 2008

Roses are red, violets are blue, I'm a schizophrenic and so am I
Another question about Rails specs. I switched over to using request specs rather than controller specs, but I'm repeating myself a lot checking that routes are protected by auth.

code:
require 'rails_helper'

RSpec.describe 'Projects', :type => :request do
  include ApiHelper

  let!(:valid_create_params) do
    { name: 'Test Project' }
  end

  context 'with an unauthorized create request' do
    it 'returns a 401 status' do
      post '/projects', params: valid_create_params
      expect(response).to be_unauthorized
    end
  end

  context 'with an authorized create request' do
    let(:user) { create(:user) }

    it 'returns a project belonging to the authed user' do
      post '/projects', params: valid_create_params, headers: authorize_header(user)
      project = JSON.parse(response.body)
      
      expect(project["name"]).to eq(valid_create_params[:name])
      expect(project["user_id"]).to eq(user.id)
    end
  end

  context 'with an unauthorized get request' do
    it 'returns a 401 status' do
      get '/projects'
      expect(response).to be_unauthorized
    end
  end

  context 'with an authorized get request' do
    let(:user) { create(:user) }

    it 'returns the users projects' do
      user.projects.create!(valid_create_params)
      get '/projects', headers: authorize_header(user)
      user_projects = JSON.parse(response.body)

      expect(user_projects.count).to eq(1)
    end
  end

  context 'with an unauthorized show request' do
    it 'returns a 401 status' do
      get '/projects/1'
      expect(response).to be_unauthorized
    end
  end

  context 'with an authorized show request' do
    let(:user) { create(:user) }
    let(:user2) { create(:user) }
    let(:user_project) { create(:project, valid_create_params.merge(user_id: user.id)) }
    let(:user2_project) { create(:project, valid_create_params.merge(user_id: user2.id)) }
    
    before do
      get "/projects/#{project_id}", headers: authorize_header(user)
    end
    
    context 'with a project_id belonging to the user' do
      let(:project_id) { user_project.id }
      
      it 'returns the project' do
        expect(JSON.parse(response.body)["user_id"]).to eq(user.id)
      end
    end
    
    context 'with a project_id that doesnt belong to the user' do
      let(:project_id) { user2_project.id }

      it 'returns not found' do
        expect(response).to be_not_found
      end
    end
  end

  context 'with an unauthorized delete request' do
    it 'returns 401 status' do
      delete '/projects/1'
      expect(response).to be_unauthorized
    end
  end

  context 'with an authorized delete request' do
    let(:user) { create(:user) }
    let(:user2) { create(:user) }
    let(:user_project) { create(:project, valid_create_params.merge(user_id: user.id)) }
    let(:user2_project) { create(:project, valid_create_params.merge(user_id: user2.id)) }

    before do
      delete "/projects/#{project_id}", headers: authorize_header(user)
    end

    context 'with a project belonging to the user' do
      let(:project_id) { user_project.id }

      it 'should delete authed users project if it exists' do
        expect(response).to be_ok
      end
    end

    context 'with a project belonging to another user' do
      let(:project_id) { user2_project.id }

      it 'returns not found' do
        expect(response).to be_not_found
      end
    end
  end
end
The actual before_action that requires auth is in the application controller, with only a few routes skipping it, such as creating a new user. Should I just have a small application_controller spec that checks the before_action works correctly, or should I be as verbose as this?

kayakyakr
Feb 16, 2004

Kayak is true

ddiddles posted:

Another question about Rails specs. I switched over to using request specs rather than controller specs, but I'm repeating myself a lot checking that routes are protected by auth.

...
The actual before_action that requires auth is in the application controller, with only a few routes skipping it, such as creating a new user. Should I just have a small application_controller spec that checks the before_action works correctly, or should I be as verbose as this?

You could also do: https://relishapp.com/rspec/rspec-core/docs/example-groups/shared-examples
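For the auth repetition specifically, a shared example could look something like this (a sketch; it assumes the same routes and helpers as your spec):

```ruby
# spec/support/shared_examples/authenticated_route.rb
RSpec.shared_examples "an authenticated route" do |method, path|
  it "returns unauthorized without credentials" do
    public_send(method, path)
    expect(response).to be_unauthorized
  end
end

# Each "unauthorized" context then collapses to one line per route:
RSpec.describe 'Projects', type: :request do
  it_behaves_like "an authenticated route", :get, '/projects'
  it_behaves_like "an authenticated route", :post, '/projects'
  it_behaves_like "an authenticated route", :get, '/projects/1'
  it_behaves_like "an authenticated route", :delete, '/projects/1'
end
```

The authorized-path behaviour still gets its own specs; only the boilerplate 401 checks are shared.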

ddiddles
Oct 21, 2008

Roses are red, violets are blue, I'm a schizophrenic and so am I
Exactly what I was looking for, thanks!

GeorgieMordor
Jan 23, 2015
I'm using FactoryBot in my test cases. How do I create a related record that can be referenced multiple times in a single factory? For instance, instead of using association :owner in a :car factory, I'd create the :owner record first and then explicitly define its properties in the :car factory. Example of what I thought might work:


code:
FactoryBot.define do
  factory :car do
    _owner = create(:owner)

    make { "Honda" }
    model { "Civic" }
    year { (1980..2019).to_a.sample }
    owner_name { _owner.name }
    owner_age { _owner.age }
  end
end
I realized this was a problem because there could be some environment conflicts between the :test and :development environments. For instance, foreign key restrictions seem to step on each other's toes when I try to boot up a local server.

So, I'm either going about building my factories wrong, OR I have my environments somehow misconfigured?

kayakyakr
Feb 16, 2004

Kayak is true

GeorgieMordor posted:

I'm using FactoryBot in my test cases. How do I create a related record that can be referenced multiple times in a single factory? For instance, instead of using association :owner in a :car factory, I'd create the :owner record first and then explicitly define its properties in the :car factory. Example of what I thought might work:


code:
FactoryBot.define do
  factory :car do
    _owner = create(:owner)

    make { "Honda" }
    model { "Civic" }
    year { (1980..2019).to_a.sample }
    owner_name { _owner.name }
    owner_age { _owner.age }
  end
end
I realized this was a problem because there could be some environment conflicts between the :test and :development environments. For instance, foreign key restrictions seem to step on each other's toes when I try to boot up a local server.

So, I'm either going about building my factories wrong, OR I have my environments somehow misconfigured?

So, there are definitely ways of doing this, yes, but... why are you denormalizing your database like this? Use a join or an include to get that information from the DB, don't save (and then have to sync) it to the model.
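The normalized version of that, sketched in plain Ruby with hypothetical classes (in ActiveRecord it would be `belongs_to :owner` plus `delegate :name, :age, to: :owner, prefix: true`):

```ruby
require "forwardable"

class Owner
  attr_reader :name, :age

  def initialize(name:, age:)
    @name = name
    @age = age
  end
end

class Car
  extend Forwardable

  # Read owner details through the association instead of copying
  # owner_name/owner_age columns onto cars and keeping them in sync.
  def_delegator :@owner, :name, :owner_name
  def_delegator :@owner, :age,  :owner_age

  def initialize(make:, model:, owner:)
    @make = make
    @model = model
    @owner = owner
  end
end
```

With that shape, the factory only needs `association :owner` and nothing else to sync.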

GeorgieMordor
Jan 23, 2015

kayakyakr posted:

So, there are definitely ways of doing this, yes, but... why are you denormalizing your database like this? Use a join or an include to get that information from the DB, don't save (and then have to sync) it to the model.

Also my thoughts, though this is a legacy application I've been brought on board for so I'm at the mercy of the current functionality. At least for now.

For the sake of supporting these convoluted models in tests / factories though, you indicated some ways to do this -- what are they?

kayakyakr
Feb 16, 2004

Kayak is true

GeorgieMordor posted:

Also my thoughts, though this is a legacy application I've been brought on board for so I'm at the mercy of the current functionality. At least for now.

For the sake of supporting these convoluted models in tests / factories though, you indicated some ways to do this -- what are they?

You'd use after hooks to do this. From way back in the factorygirl days: https://thoughtbot.com/blog/aint-no-calla-back-girl
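Sketched against the hypothetical :car factory from above, the callback version looks roughly like this (FactoryBot DSL, not runnable standalone):

```ruby
# Build the owner once per car via a callback, then copy what you need
# onto the record, instead of calling create(:owner) at definition time.
FactoryBot.define do
  factory :car do
    make  { "Honda" }
    model { "Civic" }
    year  { rand(1980..2019) }

    after(:build) do |car|
      owner = create(:owner)
      car.owner_name = owner.name
      car.owner_age  = owner.age
    end
  end
end
```

The key difference from the original snippet: the callback runs each time a car is built, rather than once when the factory file is loaded.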

GeorgieMordor
Jan 23, 2015

kayakyakr posted:

You'd use after hooks to do this. From way back in the factorygirl days: https://thoughtbot.com/blog/aint-no-calla-back-girl

FactoryGirl, wow. See now you're making me feel old.

Thanks for the link -- I'll check that out.

The Dark Souls of Posters
Nov 4, 2011

Just Post, Kupo
You could also probably use a let!

code:
let!(:car) { FactoryBot.create(:car) }
That's assuming you've defined that specific :car factory.

https://relishapp.com/rspec/rspec-core/docs/helper-methods/let-and-let

kayakyakr
Feb 16, 2004

Kayak is true

Awesome Animals posted:

You could also probably use a let!

code:
let!(:car) { FactoryBot.create(:car) }
That's assuming you've defined that specific FactoryBot.create

https://relishapp.com/rspec/rspec-core/docs/helper-methods/let-and-let

Or actually, could use factorybot's version of this, the transient: https://github.com/thoughtbot/factory_bot/blob/master/GETTING_STARTED.md#transient-attributes
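The transient-attribute version (again a sketch against the hypothetical :car factory; FactoryBot DSL, not runnable standalone):

```ruby
# A transient `owner` is evaluated once per build and can be referenced
# by several attributes, or overridden per test:
#   create(:car, owner: build(:owner, name: "Jane"))
FactoryBot.define do
  factory :car do
    transient do
      owner { create(:owner) }
    end

    make  { "Honda" }
    model { "Civic" }
    owner_name { owner.name }
    owner_age  { owner.age }
  end
end
```

Transient attributes never get set on the car itself, which is exactly what you want when the owner columns are the only thing being copied.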

The Dark Souls of Posters
Nov 4, 2011

Just Post, Kupo

Oh, neat. I’ll try this at my next opportunity to do so

GeorgieMordor
Jan 23, 2015
I'm working through a Rails update now. Pre-existing Rails 4.2.* app to 5.2.3.

The application runs against Postgres, and is utilizing some json and jsonb fields.

While testing the upgrade, we learned that some of these fields are being saved as strings instead of accessible JSON, which then breaks subsequent lookups when the code can't read them as hashes. Further reading suggests this is a known issue with Rails upgrades, but I'm not totally understanding where the disconnect is happening.

Trying to debug, I see that params come through the request looking like this:

code:
"json"=>"{\"widgets\":[{\"Widget3000\":\"WG9240\"}]}"
What I don't understand is where those escape slashes are being added. Post-request by Rails? Or by the original requester, sent over with the payload?

What's tripping me up is that if I try to post the same structure from a debug tool like Paw or Postman, Rails responds that it's unparsable JSON. Yet if the original requester sends it over in that shape, the payload is saved to the database without any error response, though it's saved incorrectly as a string.

Pollyanna
Mar 5, 2005

Milk's on them.


An engineer is proposing monkey patching two new methods into String for basic “encrypt”/“decrypt” functionality (nothing fancy). I was under the impression that monkey patching is bad, but why exactly is that the case?

DaTroof
Nov 16, 2000

CC LIMERICK CONTEST GRAND CHAMPION
There once was a poster named Troof
Who was getting quite long in the toof

Pollyanna posted:

An engineer is proposing monkey patching two new methods into String for basic “encrypt”/“decrypt” functionality (nothing fancy). I was under the impression that monkey patching is bad, but why exactly is that the case?

Overriding an existing method is heinous because it can lead to incorrect assumptions and elusive bugs. Adding new methods to an existing class isn't as bad. Given that Rails is already a quilt sewn from monkeypatches, adding a couple String methods seems okay; but my first preference would still be to encapsulate those features in a dedicated class or module, especially if the implementation requires internal state.
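A sketch of that encapsulation, using only the stdlib. The key handling here is a toy (a bare SHA-256 digest of a passphrase), not a vetted crypto design; in a real Rails app you'd likely reach for ActiveSupport::MessageEncryptor instead:

```ruby
require "openssl"
require "base64"

# A dedicated module instead of String#encrypt / String#decrypt.
module SimpleCrypto
  CIPHER = "aes-256-cbc"

  def self.encrypt(plaintext, key)
    cipher = OpenSSL::Cipher.new(CIPHER).encrypt
    cipher.key = OpenSSL::Digest.digest("SHA256", key)
    iv = cipher.random_iv
    # Prepend the IV so decrypt can recover it from the ciphertext.
    Base64.strict_encode64(iv + cipher.update(plaintext) + cipher.final)
  end

  def self.decrypt(encoded, key)
    raw = Base64.strict_decode64(encoded)
    cipher = OpenSSL::Cipher.new(CIPHER).decrypt
    cipher.key = OpenSSL::Digest.digest("SHA256", key)
    cipher.iv = raw[0, 16]
    cipher.update(raw[16..]) + cipher.final
  end
end
```

Callers write `SimpleCrypto.encrypt(token, key)` rather than `token.encrypt(key)`, which keeps the dependency explicit and the global String class untouched.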

kayakyakr
Feb 16, 2004

Kayak is true

DaTroof posted:

Overriding an existing method is heinous because it can lead to incorrect assumptions and elusive bugs. Adding new methods to an existing class isn't as bad. Given that Rails is already a quilt sewn from monkeypatches, adding a couple String methods seems okay; but my first preference would still be to encapsulate those features in a dedicated class or module, especially if the implementation requires internal state.

Good response.

GeorgieMordor posted:

I'm working through a Rails update now. Pre-existing Rails 4.2.* app to 5.2.3.

The application runs against Postgres, and is utilizing some json and jsonb fields.

While testing the upgrade we learned that some of these fields are being saved as strings instead of parsed JSON, which breaks subsequent lookups against those fields since the values can't be accessed as a hash. Further reading suggests this is a known issue with Rails upgrades, but I'm not totally understanding where the disconnect is happening.

Trying to debug, I see that params come through the request looking like this:

code:
"json"=>"{\"widgets\":[{\"Widget3000\":\"WG9240\"}]}"
What I don't understand is where those escape slashes are being added. Post-request by Rails? By the original requester and sent over with the payload?

What's tripping me up is that if I try to post the same structure in a debug tool like Paw or Postman, I get a response from Rails that it's unparseable JSON. Yet if the original requester sends it over in that shape, the payload is saved to the database without any error responses, though it's saved incorrectly as a string.

I think that's coming from the original requester. That's a string being encoded and wrapped as another string. So the request is being sent like:

post('endpoint', { json: JSON.stringify(someJson) })

Right? So that happens with a lot of libraries (I think including jquery) when you don't provide a content type with the ajax post. They automatically encode things as strings and don't leave them as JSON.

Only thing I can think of with Postman is that you're actually posting unparseable JSON for one reason or another.
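To make the double-encoding concrete, here's a plain-Ruby sketch of what a client like that effectively sends (payload made up to match the params shown earlier):

```ruby
require 'json'

payload = { "widgets" => [{ "Widget3000" => "WG9240" }] }

# The client stringifies the inner value itself, so the server receives
# a JSON document whose "json" key holds a *string*, not an object --
# exactly the escaped shape seen in params.
body = JSON.generate({ "json" => JSON.generate(payload) })

JSON.parse(body)["json"].class  # => String, not Hash
```

Rails dutifully stores that string in the jsonb column, and everything downstream that expects a hash falls over.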

xtal
Jan 9, 2011

by Fluffdaddy

Pollyanna posted:

An engineer is proposing monkey patching two new methods into String for an “encrypt”/“decrypt” feature (real basic, nothing fancy). I was under the impression that monkey patching is bad, but why exactly is this the case?

To flip this around, why do they want to do that? It isn't any better than having a class with static methods, which is what ActiveSupport does.

necrotic
Aug 2, 2005
I owe my brother big time for this!
Yeah, I'd think you would want a class so you know it's encrypted, otherwise it'd be difficult to tell.
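Something like this hypothetical wrapper (Base64 again standing in for a real cipher, names made up) makes the "is this encrypted?" question answerable by type instead of by guessing:

```ruby
require 'base64'

# Sketch: a tiny value object so ciphertext can't be mistaken for
# plaintext. Base64 is only a placeholder for the actual cipher.
class EncryptedString
  def self.encrypt(plaintext)
    new(Base64.strict_encode64(plaintext))
  end

  def initialize(ciphertext)
    @ciphertext = ciphertext
  end

  attr_reader :ciphertext

  def decrypt
    Base64.strict_decode64(@ciphertext)
  end
end

secret = EncryptedString.encrypt("hunter2")
secret.class    # => EncryptedString, not String
secret.decrypt  # => "hunter2"
```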

GeorgieMordor
Jan 23, 2015

kayakyakr posted:

Good response.


I think that's coming from the original requester. That's a string being encoded and wrapped as another string. So the request is being sent like:

post('endpoint', { json: JSON.stringify(someJson) })

Right? So that happens with a lot of libraries (I think including jquery) when you don't provide a content type with the ajax post. They automatically encode things as strings and don't leave them as JSON.

Only thing I can think of with Postman is that you're actually posting unparseable JSON for one reason or another.

The requester in this case is a hardware device running on Android, but I don’t have total familiarity with that code base, so I’ll see if I can isolate how it’s shaping the payload before it comes over.

What’s confusing is that the same requests coming from this device didn’t have this problem on Rails 4, so I’m trying to isolate whether the payload was always incorrectly shaped and “just worked”, OR whether it’s Rails 5’s handling of json and jsonb fields that needs the attention and would rectify the situation.

Like, I proved I can build a serializer for the model field that fixes the problem for a single JSON array and saves it properly. Nested JSON still has problems, though.
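For reference, a hedged sketch of that coder idea in plain Ruby. The class and attribute names here are invented, and the `serialize` wiring is shown only as a comment since it depends on the model:

```ruby
require 'json'

# Hypothetical coder: if the attribute arrives as a JSON-encoded String
# (the double-encoded case), parse it back into a Hash before it's
# written to the jsonb column; otherwise pass it through untouched.
class JsonStringCoder
  def self.dump(value)
    value.is_a?(String) ? JSON.parse(value) : value
  end

  def self.load(value)
    value
  end
end

# Wired up on the model (assumed names) with something like:
#   serialize :json, JsonStringCoder

JsonStringCoder.dump("{\"widgets\":[]}")   # => {"widgets"=>[]}
JsonStringCoder.dump({ "widgets" => [] })  # => {"widgets"=>[]}
```

This normalizes the column on write, but fixing the Android client so it stops double-encoding would be the cleaner long-term fix.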

xtal
Jan 9, 2011

by Fluffdaddy
e: incorrect post here

Dr. Krieger
Apr 9, 2010

DaTroof posted:

Overriding an existing method is heinous because it can lead to incorrect assumptions and elusive bugs. Adding new methods to an existing class isn't as bad. Given that Rails is already a quilt sewn from monkeypatches, adding a couple String methods seems okay; but my first preference would still be to encapsulate those features in a dedicated class or module, especially if the implementation requires internal state.

This is perfect; the only thing I'd add is that if you only need your monkey patch in a few specific places, take a look at refinements. They are very weird because of how they handle scoping, but they work well when you need an explicit extension on a base class in only a few places, and they let you remove the extension easily later, versus patching globally, if you find a cleaner solution.
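A tiny sketch of what that looks like, with rot13 standing in for real crypto; the String extension is only visible where `using` is in effect:

```ruby
# Refinement sketch: the rot13 method exists only in files/scopes that
# opt in with `using`, not globally on every String in the process.
module StringCrypto
  refine String do
    def rot13
      tr('A-Za-z', 'N-ZA-Mn-za-m')
    end
  end
end

using StringCrypto

"secret".rot13         # => "frperg"
"secret".rot13.rot13   # => "secret"
```

Code that hasn't said `using StringCrypto` gets a NoMethodError, which is exactly the containment Dr. Krieger is describing.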
