This repository has been archived by the owner on Jun 3, 2024. It is now read-only.

Made DML.cls abstract, added private methods for upserting by ext ID
jongpie committed Jun 3, 2017
1 parent 24fbf29 commit b49315d
Showing 5 changed files with 85 additions and 124 deletions.
38 changes: 30 additions & 8 deletions src/classes/DML.cls
@@ -2,7 +2,13 @@
* This file is part of the Nebula Framework project, released under the MIT License. *
* See LICENSE file or go to https://github.com/jongpie/NebulaFramework for full license details. *
*************************************************************************************************/
public virtual class DML extends NebulaCore implements IDML {
public abstract class DML extends NebulaCore implements IDML {

@jamessimone (Collaborator) commented on Jun 3, 2017:
Taking a look at these changes on my phone, so I will probably just leave this one comment. I like the idea of DML being abstract, but in practice we have often injected just the DML class - outside of the handlers, or through the handlers via DI with a separate class - when no querying is necessary and only the database insert / update / delete call is going to be used. Having the DML calls always mocked outside of our more complicated integration tests has been the key to our test speed success, and for clarity's sake we don't inject the full repo for an object if it is only going to be used for CRUD.

@jongpie (Author, Owner) commented on Jun 3, 2017:

Yo! Any reason you aren't creating a repo class for each sobject type that you're doing CRUD on (instead of calling DML.cls directly)? SObjectRepository is the solution to CRUD - DML.cls doesn't have query functionality (Read), so it doesn't have all of the pieces to fully solve CRUD.

I think calling DML.cls directly also makes it very confusing for when to use SObjectRepository vs just calling DML directly, and from the perspective of the overall framework, the idea is to use 1 (or more, if needed) SObjectRepository classes for each SObject type being used in an org.

I think you also lose the ability to add sobject-specific functionality to your code if you're calling DML directly. You could solve that by extending it (since it's virtual), but then you might as well just extend SObjectRepository (which has all DML methods + querying).

  • By using repo classes upfront, you can then more easily add object-specific logic (for any CRUD operation, including queries).
  • If you start by calling DML.cls directly and then later decide you need to add additional logic for a specific object type, then you have to go back and replace any calls to DML.cls with calls to a new class with object-specific logic - it seems better in the long run to start out with a repo for everything.

Even with this change though, the mocks still seem to be working (with some tweaks to reflect my other changes), so I don't think the changes would impact testing speeds.
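
For illustration, here is a minimal sketch of the pattern being described - one repository class per SObject type, extending SObjectRepository so it inherits both the DML methods and the querying. The OpportunityRepository name, field list, and closeAsWon method are hypothetical and not part of the framework:

public with sharing class OpportunityRepository extends SObjectRepository {

    public OpportunityRepository() {
        // SObject type + the fields this repo's queries should return by default
        super(Schema.Opportunity.SObjectType, new List<Schema.SObjectField>{
            Schema.Opportunity.Id, Schema.Opportunity.Name, Schema.Opportunity.StageName
        });
    }

    // Object-specific logic lives in the repo instead of being scattered across callers of DML.cls
    public void closeAsWon(List<Opportunity> opportunities) {
        for(Opportunity opportunity : opportunities) {
            opportunity.StageName = 'Closed Won';
        }
        // Bulk upsert inherited from DML.cls; the list is concretely typed, so no dynamic cast is needed
        this.upsertRecords(opportunities);
    }
}

Callers (trigger handlers, services, tests) would then depend only on OpportunityRepository and never touch DML.cls directly.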

@jamessimone (Collaborator) commented on Jun 4, 2017:

I definitely agree with the spirit of what you're saying. I am coming from a place where our factory class is responsible for object instantiation. Yes, we pretty much have a repo for all of the custom objects and standard objects we are using, but we don't have setters to allow tests to swap out the implementation of the SObjectRepository interface for ALL of the repos. We do have setters in some instances, but if only a DML operation is going to be used by an object, just having the one setter for the DML mock is handy.

As the architect you definitely have the purview to say "this is a design decision that I want to enforce." When I first started working on our new org, I definitely imagined that I would just be injecting a repository in practice any time I needed something retrieved or otherwise modified for an SObject, as you have said. Things have turned out a bit differently, even if we sometimes have had to go and change a DML injection into an actual repo. By the same token, though, I can definitely say that we have seen cases in instantiation where we've actually moved from having a repo to just having the DML class be injected. It's slightly more explicit.

I can't elaborate any further on what is only a basic instinct that I have surrounding this code smell. After having read what you've written, I can't say that I find any fault with it.
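
For contrast, a rough sketch of the factory + setter arrangement described above - the Factory class, the dmlMock member, and AccountRepository as the production implementation are all hypothetical names, not part of Nebula:

public class Factory {

    // A test can assign any IDML mock here, so only the DML calls get stubbed
    // instead of mocking a full repository for every SObject type
    @TestVisible
    private static IDML dmlMock;

    public static IDML getAccountDml() {
        if(dmlMock != null) return dmlMock;
        // Production code gets a concrete repo; with DML.cls abstract, a bare 'new DML(...)' is no longer an option
        return new AccountRepository();
    }
}

That trade-off - losing the option of handing back a bare DML instance once the class is abstract - is what the rest of this thread is weighing.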

@jongpie (Author, Owner) commented on Jun 4, 2017:

Let me start by saying that I'm so pumped to have these kinds of in-depth technical discussions with you, but also so bummed that it's happening after I've moved away from Boston.

With that said, I think I do want to stick with enforcing the use of SObjectRepository, at least for now. To give some perspective, my hope/goal is both to use Nebula for my own work and for it to take off enough that other developers eventually implement it too. For the first stable release (v1.x), I want to minimize any additional classes or methods that I'm on the fence about - it's easier to add features later, but I think it'll be a big pain in the ass if (when) methods, classes, etc. have to be removed.

So with that in mind, I think I want to stick with my original plan of trying to enforce sobject repos for now, but I definitely get what you're saying that if only a DML operation is needed for an object, just using DML & a DML mock is useful. I think once version 1.0 is out (and maybe a few more releases occur for stabilisation, new features, etc.), I'd like to dive more into how unit tests, mocks, etc. are handled by Nebula. Salesforce seems to be trying to improve that area too (see #51 for more info), but I want to keep things as simple as possible for now - if we go live with DML being virtual (instead of abstract), it's much harder to change that after we're in 'stable releases' mode. We can more easily switch from abstract to virtual than the other way around.

If you're good with that for now, I'm definitely interested in revisiting in the future.

@jamessimone (Collaborator) commented on Jun 5, 2017:

Leave it as abstract - definitely easier to move to virtual rather than the other way around. And I saw your notes about the apex stubs. We will see how that shakes out, as well. I'm definitely pumped for you to be getting the prod experience with the framework, too!


    private Schema.SObjectType sobjectType;

    public DML(Schema.SObjectType sobjectType) {
        this.sobjectType = sobjectType;
    }

    public virtual void insertRecords(SObject record) {
        this.insertRecords(new List<SObject>{record});
@@ -21,13 +27,7 @@ public virtual class DML extends NebulaCore implements IDML {
    }

    public virtual void upsertRecords(SObject record) {
        // Salesforce will only allow upsert calls for SObjects if a declared-type list is passed in.
        // This is fine for the bulk method, where we can assume the caller is passing in an explicit list, but for a single record,
        // the only way to successfully perform the upsert is to dynamically spin up a list of the SObject's type
        String listType = 'List<' + record.getSObjectType() + '>';
        List<SObject> castRecords = (List<SObject>)Type.forName(listType).newInstance();
        castRecords.add(record);
        this.upsertRecords(castRecords);
        this.upsertRecords(this.castRecords(record));
    }

    public virtual void upsertRecords(List<SObject> records) {
@@ -59,4 +59,26 @@
        if(!records.isEmpty()) Database.emptyRecycleBin(records);
    }

    // Not all objects will have external ID fields, so these methods are protected (instead of public)
    // Any object that needs an upsert by external ID can expose these methods in their repos
    protected virtual void upsertRecords(SObject record, Schema.SObjectField externalIdField) {
        this.upsertRecords(this.castRecords(record), externalIdField);
    }

    protected virtual void upsertRecords(List<SObject> records, Schema.SObjectField externalIdField) {
        Database.upsert(records, externalIdField);
    }

    private List<SObject> castRecords(SObject record) {
        // Salesforce will only allow upsert calls for SObjects if a declared-type list is passed in.
        // This is fine for the bulk method, where we can assume the caller is passing in an explicit list, but for a single record,
        // the only way to successfully perform the upsert is to dynamically spin up a list of the SObject's type

        String listType = 'List<' + this.sobjectType + '>';
        List<SObject> castRecords = (List<SObject>)Type.forName(listType).newInstance();
        castRecords.add(record);

        return castRecords;
    }

}
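
As a sketch of how these protected methods are meant to surface, a repository for an object that actually has an external ID field could widen them to public, mirroring the override added in the test classes below. ProductRepository, its field list, and the ExternalId__c field are illustrative only:

public with sharing class ProductRepository extends SObjectRepository {

    public ProductRepository() {
        super(Schema.Product2.SObjectType, new List<Schema.SObjectField>{
            Schema.Product2.Id, Schema.Product2.Name
        });
    }

    // Assuming Product2 has an external ID field in this org,
    // the repo exposes the protected DML method as public
    public override void upsertRecords(SObject record, Schema.SObjectField externalIdField) {
        super.upsertRecords(record, externalIdField);
    }
}

A caller would then pass that field explicitly, e.g. new ProductRepository().upsertRecords(product, Schema.Product2.ExternalId__c), where ExternalId__c stands in for whatever external ID field the org defines; repos for objects without one simply never expose the overload.
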
94 changes: 0 additions & 94 deletions src/classes/DML_Tests.cls

This file was deleted.

5 changes: 0 additions & 5 deletions src/classes/DML_Tests.cls-meta.xml

This file was deleted.

4 changes: 3 additions & 1 deletion src/classes/SObjectRepository.cls
@@ -5,7 +5,7 @@
public virtual class SObjectRepository extends DML implements ISObjectRepository, IDML {

    protected IQueryBuilder Query {
        get { return new QueryBuilder(sobjectType, fieldSet, sObjectFieldList); }
        get { return new QueryBuilder(sobjectType, fieldSet, sobjectFieldList); }
    }

    private final Schema.SObjectType sobjectType;
@@ -27,6 +27,8 @@ public virtual class SObjectRepository extends DML implements ISObjectRepository
    }

    private SObjectRepository(Schema.SObjectType sobjectType, Schema.FieldSet fieldSet, List<Schema.SObjectField> sobjectFieldList) {
        super(sobjectType);

        this.currentModule = NebulaCore.Module.REPOSITORY;

        this.sobjectType = sobjectType;
68 changes: 52 additions & 16 deletions src/classes/SObjectRepository_Tests.cls
@@ -32,25 +32,21 @@ private class SObjectRepository_Tests {
)
.getFirstQueryResult();
}
}

public override void upsertRecords(SObject record) {
Account account = (Account)record;
upsert account;
}

public override void upsertRecords(List<SObject> recordList) {
List<Account> accountList = (List<Account>)recordList;
upsert accountList;
public with sharing class ContactRepository extends SObjectRepository {
public ContactRepository() {
super(Schema.Contact.SObjectType, new List<Schema.SObjectField>{Schema.Contact.Id, Schema.Contact.AccountId, Schema.Contact.Email, Schema.Contact.LastName});
}
}

public override void updateRecords(SObject record) {
TestingUtils.updatedRecords.add(record);
private without sharing class UserRepository extends SObjectRepository {
public UserRepository() {
super(Schema.User.SObjectType, new List<Schema.SObjectField>{Schema.User.Id, Schema.User.Name});
}
}

public with sharing class ContactRepository extends SObjectRepository {
public ContactRepository() {
super(Schema.Contact.SObjectType);
public override void upsertRecords(SObject record, Schema.SObjectField externalIdField) {
super.upsertRecords(record, externalIdField);
}
}

@@ -185,7 +181,7 @@
static void it_should_return_account_and_contacts_with_emails_as_children_records() {
Account account = [SELECT Id FROM Account LIMIT 1];

Contact contact = createContact(Account.Id);
Contact contact = createContact(account.Id);
contact.Email = '[email protected]';
insert contact;
contact = [SELECT Id, AccountId, Email FROM Contact WHERE Id = :contact.Id];
@@ -258,7 +254,47 @@
new SObjectRepository_Tests.AccountRepository().updateRecords(existingAccount);
Test.stopTest();

System.assert((Account)TestingUtils.updatedRecords[0] == existingAccount);
existingAccount = [SELECT Id, LastModifiedDate FROM Account LIMIT 1];
System.assert(existingAccount.LastModifiedDate > originalLastModifiedDate, existingAccount);
}

@isTest
static void it_should_upsert_a_single_new_record() {
Account newAccount = createAccount();
System.assertEquals(null, newAccount.Id);

Test.startTest();
new SObjectRepository_Tests.AccountRepository().upsertRecords(newAccount);
Test.stopTest();

System.assertNotEquals(null, newAccount.Id);
}

@isTest
static void it_should_upsert_a_single_existing_record() {
Account existingAccount = [SELECT Id, LastModifiedDate FROM Account LIMIT 1];
Datetime originalLastModifiedDate = existingAccount.LastModifiedDate;

Test.startTest();
new SObjectRepository_Tests.AccountRepository().upsertRecords(existingAccount);
Test.stopTest();

existingAccount = [SELECT Id, LastModifiedDate FROM Account LIMIT 1];
System.assert(existingAccount.LastModifiedDate > originalLastModifiedDate, existingAccount);
}

@isTest
static void it_should_upsert_a_single_existing_record_with_external_id() {
User existingUser = [SELECT Id, LastModifiedDate, Username FROM User WHERE Id = :UserInfo.getUserId()];
Datetime originalLastModifiedDate = existingUser.LastModifiedDate;
System.assertNotEquals(null, existingUser.Username);

Test.startTest();
new SObjectRepository_Tests.UserRepository().upsertRecords(existingUser, Schema.User.Username);
Test.stopTest();

existingUser = [SELECT Id, LastModifiedDate FROM User LIMIT 1];
System.assert(existingUser.LastModifiedDate > originalLastModifiedDate, existingUser);
}

@isTest
