Merge branch 'hotfix-1.10.x' into hotfix-2.0.x
npomaroli committed Jul 26, 2023
2 parents 3d26f13 + f3abdcf commit a116c29
Showing 14 changed files with 401 additions and 56 deletions.
7 changes: 7 additions & 0 deletions CHANGELOG.adoc
@@ -21,6 +21,13 @@ include::content/docs/variables.adoc-include[]
Starting with version 2.0.0 clustering with OrientDB is no longer part of the professional support by Gentics. For high availability setups, it is recommended to use
link:https://getmesh.io/premium-features/sql-db/[Gentics Mesh SQL].

[[v2.0.2]]
== 2.0.2 (26.07.2023)

icon:check[] Core: Uniqueness checks for webroot url field values are now only performed if those values actually change. This improves the performance of e.g. schema migrations, where the webroot url field values are unlikely to change.

icon:check[] Core: An internal API for efficient loading of list field values has been added.

[[v2.0.1]]
== 2.0.1 (13.07.2023)

7 changes: 7 additions & 0 deletions LTS-CHANGELOG.adoc
@@ -17,6 +17,13 @@ include::content/docs/variables.adoc-include[]
The LTS changelog lists releases which are only accessible via a commercial subscription.
All fixes and changes in LTS releases will be released with the next minor release. Changes from LTS 1.4.x will be included in release 1.5.0.

[[v1.10.12]]
== 1.10.12 (26.07.2023)

icon:check[] Core: Uniqueness checks for webroot url field values are now only performed if those values actually change. This improves the performance of e.g. schema migrations, where the webroot url field values are unlikely to change.

icon:check[] Core: An internal API for efficient loading of list field values has been added.

[[v1.10.11]]
== 1.10.11 (12.07.2023)

@@ -5,7 +5,7 @@
| Database revision


| *2.0.0*
| *2.0.1*
| 6d5ccff3

| *2.0.1*
18 changes: 9 additions & 9 deletions doc/src/main/docs/generated/tables/mesh-env.adoc-include
@@ -83,12 +83,12 @@
| *MESH_MIGRATION_TRIGGER_INTERVAL*
| Override the migration trigger interval

| *MESH_CACHE_PATH_SIZE*
| Override the path cache size.

| *MESH_GRAPH_EXPORT_DIRECTORY*
| Override the graph database export directory.

| *MESH_CACHE_PATH_SIZE*
| Override the path cache size.

| *MESH_VERTX_EVENT_BUS_ERROR_THRESHOLD*
| Override the Vert.x eventBus error threshold in ms.

@@ -149,12 +149,12 @@
| *MESH_ELASTICSEARCH_USERNAME*
| Override the configured Elasticsearch connection username.

| *MESH_CLUSTER_TOPOLOGY_LOCK_TIMEOUT*
| Override the cluster topology lock timeout in ms.

| *MESH_GRAPH_BACKUP_DIRECTORY*
| Override the graph database backup directory.

| *MESH_CLUSTER_TOPOLOGY_LOCK_TIMEOUT*
| Override the cluster topology lock timeout in ms.

| *MESH_VERTX_EVENT_BUS_CHECK_INTERVAL*
| Override the Vert.x eventBus check interval in ms.

@@ -218,12 +218,12 @@
| *MESH_HTTP_CORS_ENABLE*
| Override the configured CORS enable flag.

| *MESH_HTTP_SSL_ENABLE*
| Override the configured https server flag.

| *MESH_GRAPH_STARTSERVER*
| Override the graph database server flag.

| *MESH_HTTP_SSL_ENABLE*
| Override the configured https server flag.

| *MESH_HTTP_VERTICLE_AMOUNT*
| Override the http verticle amount.

@@ -13,6 +13,8 @@
import java.util.Set;
import java.util.stream.Stream;

import org.apache.commons.lang.NotImplementedException;

import com.gentics.mesh.context.BulkActionContext;
import com.gentics.mesh.context.InternalActionContext;
import com.gentics.mesh.context.impl.DummyBulkActionContext;
@@ -583,7 +585,21 @@ default String getDocumentId(HibNodeFieldContainer content) {
* @param conflictI18n
* key of the message in case of conflicts
*/
void updateWebrootPathInfo(HibNodeFieldContainer content, InternalActionContext ac, String branchUuid, String conflictI18n);
default void updateWebrootPathInfo(HibNodeFieldContainer content, InternalActionContext ac, String branchUuid, String conflictI18n) {
updateWebrootPathInfo(content, ac, branchUuid, conflictI18n, true);
}

/**
* Update the property webroot path info. This will optionally also check for uniqueness conflicts of the webroot path and will throw a
 * {@link Errors#conflict(String, String, String, String...)} if one is found.
* @param ac
* @param branchUuid
* branch Uuid
* @param conflictI18n
* key of the message in case of conflicts
* @param checkForConflicts true to check for conflicts, false to omit the check
*/
void updateWebrootPathInfo(HibNodeFieldContainer content, InternalActionContext ac, String branchUuid, String conflictI18n, boolean checkForConflicts);

/**
* Update the property webroot path info. This will also check for uniqueness conflicts of the webroot path and will throw a
@@ -593,7 +609,18 @@ default String getDocumentId(HibNodeFieldContainer content) {
* @param conflictI18n
*/
default void updateWebrootPathInfo(HibNodeFieldContainer content, String branchUuid, String conflictI18n) {
updateWebrootPathInfo(content, null, branchUuid, conflictI18n);
updateWebrootPathInfo(content, null, branchUuid, conflictI18n, true);
}

/**
* Update the property webroot path info. This will optionally also check for uniqueness conflicts of the webroot path and will throw a
 * {@link Errors#conflict(String, String, String, String...)} if one is found.
* @param branchUuid
* @param conflictI18n
* @param checkForConflicts true to check for conflicts, false to omit the check
*/
default void updateWebrootPathInfo(HibNodeFieldContainer content, String branchUuid, String conflictI18n, boolean checkForConflicts) {
updateWebrootPathInfo(content, null, branchUuid, conflictI18n, checkForConflicts);
}
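The two default overloads above keep old call sites compiling by delegating to the new five-parameter variant with `checkForConflicts = true`, so existing behavior is unchanged unless a caller opts out. The pattern, reduced to a self-contained sketch (the names here are illustrative, not the Mesh API):

```java
import java.util.ArrayList;
import java.util.List;

interface PathUpdater {

	// Calls are recorded so the delegation can be observed.
	List<String> calls();

	// Pre-existing signature: old callers keep compiling and keep
	// the old behavior (conflict check always on).
	default void updatePath(String branchUuid, String conflictI18n) {
		updatePath(branchUuid, conflictI18n, true);
	}

	// New primitive operation carrying the extra flag.
	void updatePath(String branchUuid, String conflictI18n, boolean checkForConflicts);
}

class RecordingUpdater implements PathUpdater {
	private final List<String> calls = new ArrayList<>();

	@Override
	public List<String> calls() {
		return calls;
	}

	@Override
	public void updatePath(String branchUuid, String conflictI18n, boolean checkForConflicts) {
		calls.add(branchUuid + ":" + conflictI18n + ":" + checkForConflicts);
	}
}
```

Calling the two-argument overload always lands in the three-argument method with the flag set to `true`, which is why only new, migration-aware code paths can skip the check.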

/**
@@ -1007,4 +1034,50 @@ default void purge(HibNodeFieldContainer content) {
* @return
*/
FieldMap getFieldMap(HibNodeFieldContainer fieldContainer, InternalActionContext ac, SchemaModel schema, int level, List<String> containerLanguageTags);

/**
* Whether prefetching of list field values is supported. If this returns
* <code>false</code>, the methods {@link #getBooleanListFieldValues(List)},
* {@link #getDateListFieldValues(List)}, {@link #getNumberListFieldValues(List)},
* {@link #getHtmlListFieldValues(List)} and {@link #getStringListFieldValues(List)} will
* all throw a {@link NotImplementedException} when called.
*
* @return true when prefetching list field values is supported, false if not
*/
boolean supportsPrefetchingListFieldValues();

/**
* Get the boolean list field values for the given list UUIDs
* @param listUuids list UUIDs
* @return map of list UUIDs to lists of boolean values
*/
Map<String, List<Boolean>> getBooleanListFieldValues(List<String> listUuids);

/**
* Get the date list field values for the given list UUIDs
* @param listUuids list UUIDs
* @return map of list UUIDs to lists of date field values
*/
Map<String, List<Long>> getDateListFieldValues(List<String> listUuids);

/**
* Get the number list field values for the given list UUIDs
* @param listUuids list UUIDs
* @return map of list UUIDs to lists of number field values
*/
Map<String, List<Number>> getNumberListFieldValues(List<String> listUuids);

/**
* Get the html list field values for the given list UUIDs
* @param listUuids list UUIDs
* @return map of list UUIDs to lists of html field values
*/
Map<String, List<String>> getHtmlListFieldValues(List<String> listUuids);

/**
* Get the string list field values for the given list UUIDs
* @param listUuids list UUIDs
* @return map of list UUIDs to lists of string field values
*/
Map<String, List<String>> getStringListFieldValues(List<String> listUuids);
}
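The batched accessors return a map keyed by list UUID, letting a caller replace N per-list lookups with a single one, guarded by `supportsPrefetchingListFieldValues()`. A self-contained sketch of that consuming pattern (the loader functions stand in for the dao and are not the real Mesh calls):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

class ListValueLoader {

	/**
	 * Load values for many lists at once when the backing dao supports
	 * prefetching, falling back to per-list loading otherwise. Both
	 * loaders are supplied by the caller; nothing here is the Mesh API.
	 */
	static Map<String, List<String>> load(
			List<String> listUuids,
			boolean supportsPrefetch,
			Function<List<String>, Map<String, List<String>>> batchLoader,
			Function<String, List<String>> singleLoader) {
		if (supportsPrefetch) {
			// One round trip for all lists.
			return batchLoader.apply(listUuids);
		}
		// Fallback: one round trip per list UUID.
		Map<String, List<String>> result = new HashMap<>();
		for (String uuid : listUuids) {
			result.put(uuid, singleLoader.apply(uuid));
		}
		return result;
	}
}
```

Since the graph implementation returns `false` from the capability flag, callers must take the fallback branch there and may only use the batched branch on backends that advertise support.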
@@ -527,19 +527,19 @@ private NodeMeshEventModel createEvent(MeshEvent event, HibNodeFieldContainer co
}

@Override
default void updateWebrootPathInfo(HibNodeFieldContainer content, InternalActionContext ac, String branchUuid, String conflictI18n) {
default void updateWebrootPathInfo(HibNodeFieldContainer content, InternalActionContext ac, String branchUuid, String conflictI18n, boolean checkForConflicts) {
Set<String> urlFieldValues = getUrlFieldValues(content).collect(Collectors.toSet());
Iterator<? extends HibNodeFieldContainerEdge> it = getContainerEdges(content, DRAFT, branchUuid);
if (it.hasNext()) {
HibNodeFieldContainerEdge draftEdge = it.next();
updateWebrootPathInfo(content, ac, draftEdge, branchUuid, conflictI18n, DRAFT);
updateWebrootUrlFieldsInfo(content, draftEdge, branchUuid, urlFieldValues, DRAFT);
updateWebrootUrlFieldsInfo(content, draftEdge, branchUuid, urlFieldValues, DRAFT, checkForConflicts);
}
it = getContainerEdges(content, PUBLISHED, branchUuid);
if (it.hasNext()) {
HibNodeFieldContainerEdge publishEdge = it.next();
updateWebrootPathInfo(content, ac, publishEdge, branchUuid, conflictI18n, PUBLISHED);
updateWebrootUrlFieldsInfo(content, publishEdge, branchUuid, urlFieldValues, PUBLISHED);
updateWebrootUrlFieldsInfo(content, publishEdge, branchUuid, urlFieldValues, PUBLISHED, checkForConflicts);
}
}

@@ -708,28 +708,31 @@ default String composeSegmentInfo(HibNode parentNode, String segment) {
* @param branchUuid
* @param urlFieldValues
* @param type
* @param checkForConflicts true when the check for conflicting values must be done, false if not
*/
private void updateWebrootUrlFieldsInfo(HibNodeFieldContainer content, HibNodeFieldContainerEdge edge, String branchUuid, Set<String> urlFieldValues, ContainerType type) {
private void updateWebrootUrlFieldsInfo(HibNodeFieldContainer content, HibNodeFieldContainerEdge edge, String branchUuid, Set<String> urlFieldValues, ContainerType type, boolean checkForConflicts) {
if (urlFieldValues != null && !urlFieldValues.isEmpty()) {
HibNodeFieldContainerEdge conflictingEdge = getConflictingEdgeOfWebrootField(content, edge, urlFieldValues, branchUuid, type);
if (conflictingEdge != null) {
HibNodeFieldContainer conflictingContainer = conflictingEdge.getNodeContainer();
HibNode conflictingNode = conflictingEdge.getNode();
if (log.isDebugEnabled()) {
log.debug(
"Found conflicting container with uuid {" + conflictingContainer.getUuid() + "} of node {" + conflictingNode.getUuid() + "}");
if (checkForConflicts) {
HibNodeFieldContainerEdge conflictingEdge = getConflictingEdgeOfWebrootField(content, edge, urlFieldValues, branchUuid, type);
if (conflictingEdge != null) {
HibNodeFieldContainer conflictingContainer = conflictingEdge.getNodeContainer();
HibNode conflictingNode = conflictingEdge.getNode();
if (log.isDebugEnabled()) {
log.debug(
"Found conflicting container with uuid {" + conflictingContainer.getUuid() + "} of node {" + conflictingNode.getUuid() + "}");
}
// We know that the found container already occupies the index with one of the given paths.
// Let's compare both sets of paths in order to determine which path caused the conflict.
Set<String> fromConflictingContainer = getUrlFieldValues(conflictingContainer).collect(Collectors.toSet());
@SuppressWarnings("unchecked")
Collection<String> conflictingValues = CollectionUtils.intersection(fromConflictingContainer, urlFieldValues);
String paths = String.join(",", conflictingValues);

throw nodeConflict(conflictingNode.getUuid(), conflictingContainer.getDisplayFieldValue(), conflictingContainer.getLanguageTag(),
"node_conflicting_urlfield_update", paths, conflictingContainer.getNode().getUuid(),
conflictingContainer.getLanguageTag());
}
// We know that the found container already occupies the index with one of the given paths.
// Let's compare both sets of paths in order to determine which path caused the conflict.
Set<String> fromConflictingContainer = getUrlFieldValues(conflictingContainer).collect(Collectors.toSet());
@SuppressWarnings("unchecked")
Collection<String> conflictingValues = CollectionUtils.intersection(fromConflictingContainer, urlFieldValues);
String paths = String.join(",", conflictingValues);

throw nodeConflict(conflictingNode.getUuid(), conflictingContainer.getDisplayFieldValue(), conflictingContainer.getLanguageTag(),
"node_conflicting_urlfield_update", paths, conflictingContainer.getNode().getUuid(),
conflictingContainer.getLanguageTag());
}
edge.setUrlFieldInfo(urlFieldValues);
} else {
@@ -42,6 +42,8 @@
import java.util.stream.Collectors;
import java.util.stream.Stream;


import org.apache.commons.collections.CollectionUtils;
import org.apache.commons.lang3.StringUtils;
import org.apache.commons.lang3.tuple.Pair;

@@ -1956,13 +1958,15 @@ default void setPublished(HibNode node, InternalActionContext ac, HibNodeFieldCo
PersistingContentDao contentDao = CommonTx.get().contentDao();
String languageTag = container.getLanguageTag();
boolean isAutoPurgeEnabled = contentDao.isAutoPurgeEnabled(container);
Set<String> oldUrlFieldValues = Collections.emptySet();

// Remove an existing published edge
HibNodeFieldContainerEdge edge = contentDao.getEdge(node, languageTag, branchUuid, PUBLISHED);
if (edge != null) {
HibNodeFieldContainer oldPublishedContainer = contentDao.getFieldContainerOfEdge(edge);
oldUrlFieldValues = contentDao.getUrlFieldValues(oldPublishedContainer).collect(Collectors.toSet());
contentDao.removeEdge(edge);
contentDao.updateWebrootPathInfo(oldPublishedContainer, branchUuid, "node_conflicting_segmentfield_publish");
contentDao.updateWebrootPathInfo(oldPublishedContainer, branchUuid, "node_conflicting_segmentfield_publish", true);
if (ac.isPurgeAllowed() && isAutoPurgeEnabled && contentDao.isPurgeable(oldPublishedContainer)) {
contentDao.purge(oldPublishedContainer);
}
@@ -1976,7 +1980,11 @@
}
// create new published edge
contentDao.createContainerEdge(node, container, branchUuid, languageTag, PUBLISHED);
contentDao.updateWebrootPathInfo(container, branchUuid, "node_conflicting_segmentfield_publish");

// only check for conflicts, when the values are different from the values of the old container
Set<String> newUrlFieldValues = contentDao.getUrlFieldValues(container).collect(Collectors.toSet());
boolean checkForConflicts = !CollectionUtils.isEqualCollection(oldUrlFieldValues, newUrlFieldValues);
contentDao.updateWebrootPathInfo(container, branchUuid, "node_conflicting_segmentfield_publish", checkForConflicts);
}
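The optimization above hinges on set equality between the old and new webroot URL field values: the uniqueness check runs only when the sets differ. The same decision in a self-contained form (plain `java.util` sets instead of commons-collections; for sets, `Set.equals` is order-insensitive and matches `CollectionUtils.isEqualCollection`):

```java
import java.util.Set;

class ConflictCheckDecision {

	/**
	 * Decide whether a webroot uniqueness check is needed: only when the
	 * set of URL field values actually changed between the old and the
	 * new container.
	 */
	static boolean needsConflictCheck(Set<String> oldValues, Set<String> newValues) {
		return !oldValues.equals(newValues);
	}
}
```

During a schema migration the values typically carry over unchanged, so the sets are equal and the (comparatively expensive) conflict lookup is skipped entirely.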

@Override
@@ -15,6 +15,7 @@


import org.apache.commons.collections4.CollectionUtils;
import org.apache.commons.lang.NotImplementedException;
import org.apache.commons.lang3.StringUtils;
import org.apache.commons.lang3.tuple.Pair;

@@ -360,4 +361,34 @@ public void deleteField(HibDeletableField field) {
public void setDisplayFieldValue(HibNodeFieldContainer container, String value) {
toGraph(container).property(NodeGraphFieldContainerImpl.DISPLAY_FIELD_PROPERTY_KEY, value);
}

@Override
public boolean supportsPrefetchingListFieldValues() {
return false;
}

@Override
public Map<String, List<Boolean>> getBooleanListFieldValues(List<String> listUuids) {
throw new NotImplementedException("Prefetching of list values is not implemented");
}

@Override
public Map<String, List<Long>> getDateListFieldValues(List<String> listUuids) {
throw new NotImplementedException("Prefetching of list values is not implemented");
}

@Override
public Map<String, List<Number>> getNumberListFieldValues(List<String> listUuids) {
throw new NotImplementedException("Prefetching of list values is not implemented");
}

@Override
public Map<String, List<String>> getHtmlListFieldValues(List<String> listUuids) {
throw new NotImplementedException("Prefetching of list values is not implemented");
}

@Override
public Map<String, List<String>> getStringListFieldValues(List<String> listUuids) {
throw new NotImplementedException("Prefetching of list values is not implemented");
}
}
@@ -1,6 +1,7 @@
package com.gentics.mesh.assertj.impl;

import static com.gentics.mesh.MeshVersion.CURRENT_API_BASE_PATH;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertNull;
@@ -9,13 +10,17 @@

import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;
import java.util.Collection;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.Scanner;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.apache.commons.lang3.StringUtils;
import org.assertj.core.api.AbstractAssert;
import org.jetbrains.annotations.NotNull;

@@ -80,8 +85,27 @@ public JsonObjectAssert hasNot(String path, String msg) {
public JsonObjectAssert has(String path, String value, String msg) {
try {
Object actualValue = getByPath(path);
String actualStringRep = String.valueOf(actualValue);
assertEquals("Value for property on path {" + path + "} did not match: " + msg, value, actualStringRep);
if (actualValue instanceof Collection<?>) {
Collection<Object> actualCollection = (Collection<Object>) actualValue;

if (StringUtils.startsWith(value, "[") && StringUtils.endsWith(value, "]")) {
value = StringUtils.removeStart(value, "[");
value = StringUtils.removeEnd(value, "]");
String[] valueParts = StringUtils.split(value, ",");
for (int i = 0; i < valueParts.length; i++) {
valueParts[i] = StringUtils.trim(valueParts[i]);
}
List<String> values = Arrays.asList(valueParts);
assertThat(actualCollection).as("Value for property on path {" + path + "}").containsOnlyElementsOf(values);
} else {
fail("Expected value for path {" + path + "} should be an array (enclosed by '[' and ']') but was {"
+ value + "}");
}

} else {
String actualStringRep = String.valueOf(actualValue);
assertEquals("Value for property on path {" + path + "} did not match: " + msg, value, actualStringRep);
}
} catch (PathNotFoundException e) {
fail("Could not find property for path {" + path + "} - Json is:\n--snip--\n" + actual.encodePrettily() + "\n--snap--\n" + msg);
}
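The new collection branch accepts an expected value written like `[a, b, c]`: it strips the brackets, splits on commas, and trims each part before comparing against the actual collection. The parsing step in isolation (plain `String` methods instead of commons-lang `StringUtils`; note this sketch, unlike `StringUtils.split`, does not special-case an empty `[]`):

```java
import java.util.ArrayList;
import java.util.List;

class ExpectedArrayParser {

	/**
	 * Parse an expected-value literal such as "[a, b, c]" into its
	 * trimmed parts. Throws if the value is not bracket-enclosed,
	 * mirroring the assertion's failure path.
	 */
	static List<String> parse(String value) {
		if (!value.startsWith("[") || !value.endsWith("]")) {
			throw new IllegalArgumentException(
				"Expected an array enclosed by '[' and ']' but was {" + value + "}");
		}
		String inner = value.substring(1, value.length() - 1);
		List<String> parts = new ArrayList<>();
		for (String part : inner.split(",")) {
			parts.add(part.trim());
		}
		return parts;
	}
}
```

The assertion then compares the parsed parts against the actual collection ignoring order, which is what AssertJ's `containsOnlyElementsOf` does in the code above.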