
Rel 6 2 4 mergeback #4406

Merged 79 commits, Jan 6, 2023

Commits
c6c0952
jm wrong bundle entry url (#4213)
jmarchionatto Oct 28, 2022
bd28730
improved logging (#4217)
fil512 Oct 30, 2022
2ff11d8
Rel 6 1 3 mergeback (#4215)
tadgh Oct 31, 2022
92d7126
pin okio-jvm for kotlin vuln (#4216)
tadgh Oct 31, 2022
6a657d4
Fix UrlUtil.unescape() by not escaping "+" to " " if this is an "appl…
lukedegruchy Oct 31, 2022
a018360
Ks 20221031 migration lock (#4224)
fil512 Nov 1, 2022
f68d905
4207-getpagesoffset-set-to-total-number-of-resources-results-in-incon…
Nov 2, 2022
928ec86
Fix bug with MDM submit
tadgh Nov 3, 2022
eb739e4
fix
tadgh Nov 3, 2022
220bfeb
Version bump
tadgh Nov 3, 2022
3199728
4234 consent in conjunction with versionedapiconverterinterceptor fai…
tadgh Nov 3, 2022
b965347
Allow Batch2 transition from ERRORED to COMPLETE (#4242)
jamesagnew Nov 4, 2022
045ceb3
3685 When bulk exporting, if no resource type param is provided, defa…
KGJ-software Nov 4, 2022
e6b80d3
Add next version
tadgh Nov 4, 2022
a296299
bulk export permanently reusing cached results (#4249)
tadgh Nov 4, 2022
590ddf1
Fix broken test
tadgh Nov 7, 2022
52aa09f
Smile 4892 DocumentReference Attachment url (#4237)
nathandoef Nov 8, 2022
2be9e89
Overlapping SearchParameter with the same code and base are not allow…
Qingyixia Nov 10, 2022
15c74a4
ignore misfires in quartz
tadgh Nov 10, 2022
5e02ea1
Allowing Failures On Index Drops (#4272)
epeartree Nov 10, 2022
9cc8be5
Revert "ignore misfires in quartz"
tadgh Nov 11, 2022
7f67493
Ignore misfires in quartz (#4273)
tadgh Nov 11, 2022
1722812
Reindex Behaviour Issues (#4261)
nathandoef Nov 11, 2022
75cadad
Set official Version
tadgh Nov 11, 2022
cc51bc2
license
tadgh Nov 11, 2022
1bd060e
Fix up numbers
tadgh Nov 11, 2022
434d817
Fix up numbers
tadgh Nov 11, 2022
a2d01fc
Update numbers
tadgh Nov 11, 2022
642afed
wip
tadgh Nov 11, 2022
b9dfd43
fix numbers
tadgh Nov 11, 2022
d08a7a9
Fix test:
tadgh Nov 11, 2022
93ab52c
Fix more tests
tadgh Nov 12, 2022
cef36bd
TEMP FIX FOR BUILD
tadgh Nov 12, 2022
3f46aa1
wip
tadgh Nov 12, 2022
c6e7381
Updating version to: 6.2.1 post release.
markiantorno Nov 12, 2022
2a455c3
Add a whack of logging
tadgh Nov 14, 2022
a2ea7ff
wip
tadgh Nov 15, 2022
138be3d
add implementation
tadgh Nov 15, 2022
c19d17b
wip and test
tadgh Nov 15, 2022
8c896fc
wip
tadgh Nov 15, 2022
92b9270
last-second-fetch
tadgh Nov 15, 2022
483b88b
expose useful method
tadgh Nov 15, 2022
4d37441
remove 10000 limit
Nov 15, 2022
59558db
Strip some logging
tadgh Nov 15, 2022
1a373c8
Fix up logging
tadgh Nov 15, 2022
779867a
Unpublicize method
tadgh Nov 15, 2022
a75ac9b
Fix version
tadgh Nov 15, 2022
e244e7a
Make minor changes
tadgh Nov 15, 2022
15c328d
once again on 6.2.1
tadgh Nov 15, 2022
ea66e37
re-add version enum
tadgh Nov 15, 2022
d2a5eb3
add folder
tadgh Nov 15, 2022
969266b
fix test
tadgh Nov 15, 2022
c25bc8e
DIsable busted test
tadgh Nov 15, 2022
4b2ede2
Disable more broken tests
tadgh Nov 16, 2022
5e67399
Only submit queued chunks
tadgh Nov 16, 2022
df7e3df
Quiet log
tadgh Nov 16, 2022
71d8f80
Fix wrong pinned version
tadgh Nov 16, 2022
2606c4a
Updating version to: 6.2.2 post release.
markiantorno Nov 17, 2022
dddac97
fixes for https://github.com/hapifhir/hapi-fhir/issues/4277 and https…
jkiddo Nov 18, 2022
7b5d303
backport and changelog for 6.2.2
Nov 25, 2022
2cf06e5
Updating version to: 6.2.3 post release.
markiantorno Dec 1, 2022
2ee0cf9
fix https://simpaticois.atlassian.net/browse/SMILE-5781
Dec 14, 2022
1c15ac9
Version bump to 6.2.3-SNAPSHOT
jamesagnew Dec 16, 2022
d60c62c
Merge branch 'rel_6_2' of github.com:jamesagnew/hapi-fhir into rel_6_2
jamesagnew Dec 16, 2022
b8ca063
Auto retry on MDM Clear conflicts (#4398)
jamesagnew Jan 2, 2023
02a4d0a
Update to 6.2.3 again
tadgh Jan 3, 2023
85b8817
Update license dates
tadgh Jan 3, 2023
2dad377
Dont fail on batch2 double delivery (#4400)
jamesagnew Jan 4, 2023
3ba279b
Update docker for release ppipeline
tadgh Jan 4, 2023
9d46faa
Updating version to: 6.2.4 post release.
markiantorno Jan 4, 2023
d578e09
Add test and implementation to fix potential NPE in pre-show resource…
tadgh Dec 22, 2022
1e5993a
Fix up megeback
tadgh Jan 5, 2023
77bf387
update backport info
tadgh Jan 5, 2023
285043d
update backport info
tadgh Jan 5, 2023
81a577c
Updating version to: 6.2.5 post release.
markiantorno Jan 5, 2023
3ea70eb
Merge branch 'rel_6_2' into rel_6_2_4_mergeback
tadgh Jan 5, 2023
ac9abb0
please
tadgh Jan 6, 2023
017650a
Merge branch 'master' into rel_6_2_4_mergeback
tadgh Jan 6, 2023
f38895c
fix test
tadgh Jan 6, 2023
@@ -107,6 +107,10 @@ public enum VersionEnum {
V6_1_4,
V6_2_0,
V6_2_1,
V6_2_2,
V6_2_3,
V6_2_4,
V6_2_5,
// Dev Build
V6_3_0,
V6_4_0
@@ -0,0 +1 @@
This version fixes a bug with 6.2.0 and previous releases wherein batch jobs that created very large chunk counts could occasionally fail to submit a small proportion of chunks.
@@ -0,0 +1,3 @@
---
release-date: "2022-11-25"
codename: "Vishwa"
@@ -0,0 +1 @@

@@ -0,0 +1,3 @@
---
release-date: "2023-01-05"
codename: "Vishwa"
@@ -0,0 +1 @@

@@ -0,0 +1,3 @@
---
release-date: "2023-01-04"
codename: "Vishwa"
@@ -1,5 +1,6 @@
---
type: add
issue: 4291
backport: 6.2.2
title: "The NPM package installer did not support installing on R4B repositories. Thanks to Jens Kristian Villadsen
for the pull request!"
@@ -2,4 +2,5 @@
type: fix
issue: 4388
jira: SMILE-5834
backport: 6.2.4
title: "Fixed an edge case during a Read operation where hooks could be invoked with a null resource. This could cause a NullPointerException in some cases."
@@ -0,0 +1,5 @@
---
type: fix
backport: 6.2.3
title: "The $mdm-clear operation sometimes failed with a constraint error when running in a heavily
multithreaded environment. This has been fixed."
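The fix for this changelog entry (visible in the MdmClearStep diff below via `setRetry(true)` / `setMaxRetries(100)`) is to retry the transactional unit of work when a version-conflict exception occurs. The following is a minimal, hypothetical sketch of that retry pattern — `ConflictRetry`, `VersionConflictException`, and `withRetries` are illustrative names, not HAPI FHIR API:

```java
import java.util.function.Supplier;

// Hypothetical sketch: retry a unit of work when a (simulated) version
// conflict is thrown, up to a maximum number of retries. HAPI FHIR wires
// the equivalent behaviour through SystemRequestDetails in this PR.
public class ConflictRetry {
    static class VersionConflictException extends RuntimeException {}

    public static <T> T withRetries(Supplier<T> work, int maxRetries) {
        for (int attempt = 0; ; attempt++) {
            try {
                return work.get();
            } catch (VersionConflictException e) {
                if (attempt >= maxRetries) {
                    throw e; // retries exhausted: propagate the conflict
                }
                // otherwise loop and re-run the whole unit of work
            }
        }
    }
}
```

Under heavy multithreading, one worker's constraint violation becomes a transparent retry instead of a failed job, at the cost of re-executing the conflicting transaction.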
@@ -0,0 +1,5 @@
---
type: fix
issue: 4400
title: "When Batch2 work notifications are received twice (e.g. because the notification engine double delivered)
an unrecoverable failure could occur. This has been corrected."
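The correction described here (shown in the WorkChunkProcessor diff below) makes chunk processing idempotent: a chunk whose data has already been consumed — e.g. because the notification was delivered twice — is skipped rather than failing the job. A minimal, hypothetical sketch of that guard; `ChunkGuard` and `executionData` are illustrative names, not HAPI FHIR API:

```java
import java.util.Optional;

// Hypothetical sketch of the idempotency guard: blank chunk data signals
// a double-delivered notification, so the caller treats it as a no-op
// instead of raising an unrecoverable error.
public class ChunkGuard {
    public static Optional<String> executionData(String chunkData) {
        if (chunkData == null || chunkData.trim().isEmpty()) {
            return Optional.empty(); // already processed: ignore quietly
        }
        return Optional.of(chunkData);
    }
}
```

Returning `Optional.empty()` (as the real change does for the step-execution details) lets the executor bail out cleanly without marking the job failed.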
@@ -156,6 +156,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B

public static final String BASE_RESOURCE_NAME = "resource";
private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(BaseHapiFhirResourceDao.class);

@Autowired
protected PlatformTransactionManager myPlatformTransactionManager;
@Autowired(required = false)
@@ -4,6 +4,7 @@
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.entity.MdmLink;
import ca.uhn.fhir.jpa.entity.PartitionEntity;
import ca.uhn.fhir.jpa.interceptor.UserRequestRetryVersionConflictsInterceptor;
import ca.uhn.fhir.jpa.mdm.provider.BaseLinkR4Test;
import ca.uhn.fhir.jpa.partition.IRequestPartitionHelperSvc;
import ca.uhn.fhir.rest.api.server.SystemRequestDetails;
@@ -14,8 +15,10 @@
import ca.uhn.fhir.mdm.api.MdmMatchResultEnum;
import ca.uhn.fhir.mdm.api.MdmQuerySearchParameters;
import ca.uhn.fhir.mdm.api.paging.MdmPageRequest;
import ca.uhn.fhir.mdm.batch2.clear.MdmClearStep;
import ca.uhn.fhir.mdm.model.MdmTransactionContext;
import ca.uhn.fhir.mdm.rules.config.MdmSettings;
import ca.uhn.fhir.rest.server.exceptions.ResourceVersionConflictException;
import ca.uhn.fhir.rest.server.interceptor.partition.RequestTenantPartitionInterceptor;
import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
import org.hl7.fhir.instance.model.api.IBaseParameters;
@@ -37,6 +40,7 @@
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;

import static ca.uhn.fhir.mdm.provider.MdmProviderDstu3Plus.DEFAULT_PAGE_SIZE;
import static ca.uhn.fhir.mdm.provider.MdmProviderDstu3Plus.MAX_PAGE_SIZE;
@@ -59,6 +63,7 @@ public class MdmControllerSvcImplTest extends BaseLinkR4Test {
private Batch2JobHelper myBatch2JobHelper;
@Autowired
private MdmSettings myMdmSettings;
private UserRequestRetryVersionConflictsInterceptor myUserRequestRetryVersionConflictsInterceptor;
private final RequestTenantPartitionInterceptor myPartitionInterceptor = new RequestTenantPartitionInterceptor();

@Override
@@ -70,12 +75,16 @@ public void before() throws Exception {
myPartitionLookupSvc.createPartition(new PartitionEntity().setId(2).setName(PARTITION_2), null);
myInterceptorService.registerInterceptor(myPartitionInterceptor);
myMdmSettings.setEnabled(true);

myUserRequestRetryVersionConflictsInterceptor = new UserRequestRetryVersionConflictsInterceptor();
myInterceptorService.registerInterceptor(myUserRequestRetryVersionConflictsInterceptor);
}

@Override
@AfterEach
public void after() throws IOException {
myMdmSettings.setEnabled(false);
myInterceptorService.unregisterInterceptor(myUserRequestRetryVersionConflictsInterceptor);
myPartitionSettings.setPartitioningEnabled(false);
myInterceptorService.unregisterInterceptor(myPartitionInterceptor);
super.after();
@@ -160,6 +169,35 @@ public void testMdmClearWithProvidedResources() {
assertLinkCount(2);
}

@Test
public void testMdmClearWithWriteConflict() {
AtomicBoolean haveFired = new AtomicBoolean(false);
MdmClearStep.setClearCompletionCallbackForUnitTest(()->{
if (haveFired.getAndSet(true) == false) {
throw new ResourceVersionConflictException("Conflict");
}
});

assertLinkCount(1);

RequestPartitionId requestPartitionId1 = RequestPartitionId.fromPartitionId(1);
RequestPartitionId requestPartitionId2 = RequestPartitionId.fromPartitionId(2);
createPractitionerAndUpdateLinksOnPartition(buildJanePractitioner(), requestPartitionId1);
createPractitionerAndUpdateLinksOnPartition(buildJanePractitioner(), requestPartitionId2);
assertLinkCount(3);

List<String> urls = new ArrayList<>();
urls.add("Practitioner");
IPrimitiveType<BigDecimal> batchSize = new DecimalType(new BigDecimal(100));
ServletRequestDetails details = new ServletRequestDetails();
details.setTenantId(PARTITION_2);
IBaseParameters clearJob = myMdmControllerSvc.submitMdmClearJob(urls, batchSize, details);
String jobId = ((StringType) ((Parameters) clearJob).getParameterValue("jobId")).getValueAsString();
myBatch2JobHelper.awaitJobCompletion(jobId);

assertLinkCount(2);
}

private class PartitionIdMatcher implements ArgumentMatcher<RequestPartitionId> {
private RequestPartitionId myRequestPartitionId;

21 changes: 1 addition & 20 deletions hapi-fhir-jpaserver-mdm/src/test/resources/logback-test.xml
@@ -1,7 +1,7 @@
<configuration>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<filter class="ch.qos.logback.classic.filter.ThresholdFilter">
<level>TRACE</level>
<level>INFO</level>
</filter>
<encoder>
<!--N.B use this pattern to remove timestamp/thread/level/logger information from logs during testing.<pattern>[%file:%line] %msg%n</pattern>-->
@@ -49,25 +49,6 @@
<appender-ref ref="STDOUT" />
</logger>

<!--
Configuration for MDM troubleshooting log
-->
<appender name="MDM_TROUBLESHOOTING" class="ch.qos.logback.core.rolling.RollingFileAppender">
<filter class="ch.qos.logback.classic.filter.ThresholdFilter"><level>DEBUG</level></filter>
<file>${smile.basedir}/log/mdm-troubleshooting.log</file>
<rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
<fileNamePattern>${smile.basedir}/log/mdm-troubleshooting.log.%i.gz</fileNamePattern>
<minIndex>1</minIndex>
<maxIndex>9</maxIndex>
</rollingPolicy>
<triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
<maxFileSize>5MB</maxFileSize>
</triggeringPolicy>
<encoder>
<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>

<logger name="ca.uhn.fhir.log.mdm_troubleshooting" level="TRACE">
<appender-ref ref="MDM_TROUBLESHOOTING"/>
</logger>
@@ -41,6 +41,8 @@
import javax.persistence.EntityManager;
import java.util.List;

import java.util.List;

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.fail;
@@ -166,6 +166,7 @@
import org.springframework.transaction.support.TransactionTemplate;

import javax.annotation.Nonnull;
import javax.sql.DataSource;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
@@ -31,6 +31,7 @@
import ca.uhn.fhir.batch2.jobs.models.BatchResourceId;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.jpa.api.config.DaoConfig;
import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
import ca.uhn.fhir.jpa.bulk.export.api.IBulkExportProcessor;
import ca.uhn.fhir.jpa.bulk.export.model.ExportPIDIteratorParameters;
import ca.uhn.fhir.rest.api.server.storage.IResourcePersistentId;
@@ -39,6 +39,10 @@

import javax.annotation.Nullable;

import java.util.Optional;

import static org.apache.commons.lang3.StringUtils.isBlank;

public class WorkChunkProcessor {
private static final Logger ourLog = Logs.getBatchTroubleshootingLog();

@@ -113,7 +117,12 @@ public WorkChunkProcessor(IJobPersistence theJobPersistence,
} else {
// all other kinds of steps
Validate.notNull(theWorkChunk);
StepExecutionDetails<PT, IT> stepExecutionDetails = getExecutionDetailsForNonReductionStep(theWorkChunk, theInstance, inputType, parameters);
Optional<StepExecutionDetails<PT, IT>> stepExecutionDetailsOpt = getExecutionDetailsForNonReductionStep(theWorkChunk, theInstance, inputType, parameters);
if (!stepExecutionDetailsOpt.isPresent()) {
return new JobStepExecutorOutput<>(false, dataSink);
}

StepExecutionDetails<PT, IT> stepExecutionDetails = stepExecutionDetailsOpt.get();

// execute the step
boolean success = myStepExecutor.executeStep(stepExecutionDetails, worker, dataSink);
@@ -146,7 +155,7 @@ protected <PT extends IModelJson, IT extends IModelJson, OT extends IModelJson>
/**
* Construct execution details for non-reduction step
*/
private <PT extends IModelJson, IT extends IModelJson> StepExecutionDetails<PT, IT> getExecutionDetailsForNonReductionStep(
private <PT extends IModelJson, IT extends IModelJson> Optional<StepExecutionDetails<PT, IT>> getExecutionDetailsForNonReductionStep(
WorkChunk theWorkChunk,
JobInstance theInstance,
Class<IT> theInputType,
@@ -155,11 +164,15 @@ private <PT extends IModelJson, IT extends IModelJson> StepExecutionDetails<PT,
IT inputData = null;

if (!theInputType.equals(VoidModel.class)) {
if (isBlank(theWorkChunk.getData())) {
ourLog.info("Ignoring chunk[{}] for step[{}] in status[{}] because it has no data", theWorkChunk.getId(), theWorkChunk.getTargetStepId(), theWorkChunk.getStatus());
return Optional.empty();
}
inputData = theWorkChunk.getData(theInputType);
}

String chunkId = theWorkChunk.getId();

return new StepExecutionDetails<>(theParameters, inputData, theInstance, chunkId);
return Optional.of(new StepExecutionDetails<>(theParameters, inputData, theInstance, chunkId));
}
}
@@ -454,6 +454,31 @@ public void testPerformStep_ChunkNotKnown() {

}

/**
* If a notification is received for a chunk that should have data but doesn't, we can just ignore that
* (just caused by double delivery of a chunk notification message)
*/
@Test
public void testPerformStep_ChunkAlreadyComplete() {

// Setup

WorkChunk chunk = createWorkChunkStep2();
chunk.setData((String)null);
setupMocks(createJobDefinition(), chunk);
mySvc.start();

// Execute

myWorkChannelReceiver.send(new JobWorkNotificationJsonMessage(createWorkNotification(STEP_2)));

// Verify
verifyNoMoreInteractions(myStep1Worker);
verifyNoMoreInteractions(myStep2Worker);
verifyNoMoreInteractions(myStep3Worker);

}

@Test
public void testStartInstance() {

@@ -41,6 +41,7 @@
import ca.uhn.fhir.rest.api.server.storage.TransactionDetails;
import ca.uhn.fhir.rest.server.provider.ProviderConstants;
import ca.uhn.fhir.util.StopWatch;
import com.google.common.annotations.VisibleForTesting;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
@@ -54,6 +55,7 @@
public class MdmClearStep implements IJobStepWorker<MdmClearJobParameters, ResourceIdListWorkChunkJson, VoidModel> {

private static final Logger ourLog = LoggerFactory.getLogger(MdmClearStep.class);
private static Runnable ourClearCompletionCallbackForUnitTest;

@Autowired
HapiTransactionService myHapiTransactionService;
@@ -71,6 +73,8 @@ public class MdmClearStep implements IJobStepWorker<MdmClearJobParameters, Resou
public RunOutcome run(@Nonnull StepExecutionDetails<MdmClearJobParameters, ResourceIdListWorkChunkJson> theStepExecutionDetails, @Nonnull IJobDataSink<VoidModel> theDataSink) throws JobExecutionFailedException {

SystemRequestDetails requestDetails = new SystemRequestDetails();
requestDetails.setRetry(true);
requestDetails.setMaxRetries(100);
requestDetails.setRequestPartitionId(theStepExecutionDetails.getParameters().getRequestPartitionId());
TransactionDetails transactionDetails = new TransactionDetails();
myHapiTransactionService.execute(requestDetails, transactionDetails, buildJob(requestDetails, transactionDetails, theStepExecutionDetails));
@@ -119,7 +123,18 @@ public Void doInTransaction(@Nonnull TransactionStatus theStatus) {

ourLog.info("Finished removing {} golden resources in {} - {}/sec - Instance[{}] Chunk[{}]", persistentIds.size(), sw, sw.formatThroughput(persistentIds.size(), TimeUnit.SECONDS), myInstanceId, myChunkId);

if (ourClearCompletionCallbackForUnitTest != null) {
ourClearCompletionCallbackForUnitTest.run();
}

return null;
}
}


@VisibleForTesting
public static void setClearCompletionCallbackForUnitTest(Runnable theClearCompletionCallbackForUnitTest) {
ourClearCompletionCallbackForUnitTest = theClearCompletionCallbackForUnitTest;
}

}
6 changes: 3 additions & 3 deletions src/checkstyle/checkstyle.xml
@@ -7,9 +7,9 @@
<property name="charset" value="UTF-8"/>
<property name="cacheFile" value="target/cache_non_main_files"/>

<module name="SuppressionFilter">
<property name="file" value="src/checkstyle/checkstyle_suppressions.xml" />
</module>
<!-- <module name="SuppressionFilter">-->
<!-- <property name="file" value="${basedir}/src/checkstyle/checkstyle_suppressions.xml" />-->
<!-- </module> TODO GGG propagate this to master -->
<module name="TreeWalker">
<module name="RegexpSinglelineJava">
<property name="format" value="System\.out\.println"/>