[OUP] Withdrawn or not? Metadata vs landing page

I reported metadata vs landing page discrepancies on PubPeer:

(Sorry for altering the links, the platform would not let me post them – now edited)


Hi Guillaume,

Thanks for your post.

I’ve adjusted your forum permissions, so you should be able to post links in the future.

In these cases, some information seems to have been lost in the midst of an ownership transfer for the journal Cardiovascular Research. All of these items were initially registered by Elsevier, which added the “WITHDRAWN” prefix to the article titles shortly after registration. Shortly after that (all within the span of a year), the journal and all of its DOIs were transferred to Oxford University Press.

Whatever caused the addition of the “WITHDRAWN” prefix was evidently not communicated clearly to OUP. For their part, OUP has updated the resolution URLs several times in the intervening years and added references, but they never submitted updates for the core bibliographic metadata, including the article titles.

We can pass this feedback along to our contacts at OUP with a request to bring the metadata and landing pages in sync as to whether the content is withdrawn or not.


Thank you Shayn for this valuable information and for offering to relay my concerns to OUP.

OUP, the current stewards of this metadata, have confirmed that the works are not in fact withdrawn. They have updated the metadata for the following DOIs accordingly, @gcabanac:

10.1016/j.cardiores.2007.07.018
10.1016/j.cardiores.2007.06.024
10.1016/j.cardiores.2007.03.029
10.1016/j.cardiores.2007.07.015
10.1016/j.cardiores.2007.08.008
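For anyone who wants to verify such discrepancies themselves, the registered title for any DOI can be retrieved from the Crossref REST API (`https://api.crossref.org/works/{doi}`). A minimal sketch, assuming only the public API above; the helper names are mine:

```python
import json
import urllib.request

CROSSREF_API = "https://api.crossref.org/works/{}"

def title_is_withdrawn(work: dict) -> bool:
    """Check whether a Crossref 'message' record's registered title
    still carries a 'WITHDRAWN' prefix."""
    titles = work.get("title") or []
    return any(t.strip().upper().startswith("WITHDRAWN") for t in titles)

def fetch_work(doi: str) -> dict:
    """Fetch the Crossref metadata record for a DOI (network required)."""
    with urllib.request.urlopen(CROSSREF_API.format(doi)) as resp:
        return json.load(resp)["message"]

# Example usage (requires network access):
# work = fetch_work("10.1016/j.cardiores.2007.07.018")
# print(title_is_withdrawn(work))
```

Comparing that registered title against the title shown on the publisher's landing page is exactly the discrepancy reported above.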

Warm regards,

Isaac

Thanks for this update!
I quoted your reply in a PubPeer comment for each of these DOIs.


Thank you very much @gcabanac for this vivid example, which is both useful and instructive. It helps raise awareness among editors, reviewers, and developers of technological solutions. It also invites us to broaden the perspective and raise a related issue: the phenomenon of “tortured phrases”, which encourages us to reflect on the limits of automated detection tools when assessing the quality of the information in scientific publications :face_with_monocle:.

One useful approach could be the creation of an “invariant scientific lexicon”: a collection of technical terms whose meaning does not tolerate arbitrary linguistic substitution, such as artificial intelligence, polymerase chain reaction, or neural network. These terms cannot be replaced by synonyms without altering their scientific meaning.

The substitution of such terms often constitutes an indicator of artificial paraphrasing. :robot:
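The lexicon idea above can be sketched very simply: map known tortured variants back to the fixed technical term they displace, then scan a text for them. The sample entries below are illustrative examples of the kind reported in the tortured-phrases literature, not an authoritative list:

```python
# Toy "invariant lexicon": known tortured variants mapped to the
# canonical technical term they replace (illustrative entries only).
TORTURED_VARIANTS = {
    "counterfeit consciousness": "artificial intelligence",
    "profound learning": "deep learning",
    "colossal information": "big data",
}

def find_tortured_phrases(text: str) -> list[tuple[str, str]]:
    """Return (tortured variant, expected canonical term) pairs
    found in the given text, case-insensitively."""
    lowered = text.lower()
    return [(variant, canonical)
            for variant, canonical in TORTURED_VARIANTS.items()
            if variant in lowered]

hits = find_tortured_phrases(
    "Counterfeit consciousness methods outperform classic baselines.")
# Each hit flags a likely artificial paraphrase for human review.
```

A real tool would need phrase normalisation and a much larger, curated lexicon, but even this shape makes the point: the hits are signals for human reviewers, not verdicts.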

At the same time, this case reminds us that automated tools remain limited instruments of assistance. Interpreting anomalies, whether in metadata or in scientific texts, is a process that must always involve human judgement and editorial responsibility :brain: