Sample Entropy Calculation

Author: Cosimo Leuci

Tags: randomness measure, sample entropy

Written in NetLogo 6.4.0

WHAT IS IT?

Sample Entropy (SampEn) quantifies the unpredictability of a time series, providing a measure of its complexity. It is based on the conditional probability that two similar segments of a data sequence remain similar when each is extended by one additional point: the smaller this probability, the higher the entropy.
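
In symbols, writing B for the number of similar pattern pairs of length m and A for the number of similar pairs of length m + 1 (the counts produced in the steps below):

    SampEn = - ln (A / B)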

HOW IT WORKS

Calculating the sample entropy of a sequence involves identifying repeating patterns and assessing how likely they are to keep matching when extended. Below is an outline of the process (a short worked example follows the list):

  1. Define the sequence and two parameters:
     - m (pattern length): the length of the patterns to compare. Commonly, m is set to 2 or 3. In this app it is fixed at 2, but it can be changed in the code.
     - r (tolerance): the maximum difference allowed between values for two patterns to be deemed similar. It is usually a fraction of the sequence's standard deviation. Here, the tolerance is always zero, meaning only identical subsequences are considered similar.
  2. Generate patterns of length m. Extract all patterns of length m from the sequence.
  3. Compare patterns. Each pair of patterns of length m is compared to determine if the difference between their elements is less than or equal to the tolerance r. The number of pattern pairs that meet this criterion is counted.
  4. Extend to m+1. Repeat the process for patterns of length m+1.
  5. Compute Sample Entropy. The sample entropy is the negative natural logarithm of the ratio between the number of similar patterns of length m+1 and the number of similar patterns of length m.
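
For example, take the hypothetical sequence ababab with m = 2 and r = 0, worked out by hand with the same counting the model uses. The patterns of length 2 are ab, ba, ab, ba, ab: the three copies of ab form 3 identical pairs and the two copies of ba form 1, so 4 similar pairs of length 2 are counted. The patterns of length 3 are aba, bab, aba, bab, giving 2 similar pairs. The sample entropy is therefore - ln (2 / 4) = ln 2 ≈ 0.69315.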

HOW TO USE IT

Type your alphanumeric sequence into the INPUT-STRING field and then press the COMPUTE-SAMPLE-ENTROPY button.
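
The same computation can also be launched from the command center; a minimal sketch, assuming the INPUT-STRING widget defined in this model:

    set input-string "ababab"    ;; the sequence worked out by hand above
    compute-sample-entropy       ;; expected result: sample entropy = - ln (2 / 4) ≈ 0.69315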

THINGS TO NOTICE

Sample entropy increases as the complexity (unpredictability) of the sequence increases. If no similar patterns are detected in the sequence, the calculation cannot be completed: the count of similar patterns is zero, so the ratio inside the logarithm is undefined and the entropy diverges to infinity.
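
For instance, in the hypothetical sequence abcdef every pattern of length 2 (ab, bc, cd, de, ef) occurs only once, so no similar pairs are found and the model stops with a message instead of reporting a value.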

THINGS TO TRY

Determine the sample entropy of a sequence consisting of a single repeated character and of a sequence with a repeating cyclic pattern of two or three characters. Compare these results with the entropy of a sequence obtained by flipping a coin or rolling a die.
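
As a hand-worked illustration (using the same counting as the model), the constant sequence aaaaaaaaaa (ten characters) gives 36 identical pairs of length 2 and 28 of length 3, so its sample entropy is - ln (28 / 36) ≈ 0.25131; the value shrinks toward zero as the repetitive sequence gets longer, while sequences produced by coin flips or dice rolls typically give larger values, or cannot be evaluated at all when no pattern repeats.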

EXTENDING THE MODEL

The model can be adapted to accept values of r other than 0. In that case, a criterion for measuring the distance between the patterns being compared must be defined. The Chebyshev distance could be employed, although any distance function, such as the Euclidean distance, is applicable; for binary strings in particular, the Hamming distance would be an appropriate choice (see the sketch below).
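
A minimal sketch of such a helper (a hypothetical reporter, not part of the model) that could replace the strict equality test inside calculate-distancies when r > 0:

    ;; number of positions at which two equal-length patterns differ (hypothetical helper)
    to-report hamming-distance [pattern-a pattern-b]
      let positions range length pattern-a
      report length filter [ i -> (item i pattern-a) != (item i pattern-b) ] positions
    end

Two patterns would then be counted as similar when hamming-distance pattern-a pattern-b <= r, instead of requiring them to be identical.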

CREDITS AND REFERENCES

Delgado-Bonal A, Marshak A. Approximate Entropy and Sample Entropy: A Comprehensive Tutorial. Entropy (Basel). 2019 May 28;21(6):541. doi: 10.3390/e21060541. PMID: 33267255; PMCID: PMC7515030.


NETLOGO CODE

globals [ m                 ;; length of substrings that can be derived from the input-string
          substrings        ;; list of the substrings derived from the input-string
          substr-copy       ;; a copy of the previous list
          similar           ;; count of the pairs of similar substrings in the previous list
          sampen            ;; the sample entropy of the input-string
          index             ;; an index/pointer
         ]


;; --------- CALCULATING SAMPLE ENTROPY FOR THE INPUT-STRING SEQUENCE --------------------------------
;; ---------------------------------------------------------------------------------------------------

to compute-sample-entropy
  ca
  ifelse input-string = "" [output-print "No string to operate on"
    stop] [start-message]
  ;; append a sentinel character so the recursive stop test in built-substrings
  ;; still includes the last pattern (the sentinel itself never enters a pattern)
  set input-string word input-string "_"
  ;; Creating patterns with m = 2
  set m 2
  set similar 0
  set substrings []
  built-substrings
  let substrings2 substrings
  output-print word "Set of patterns with m = 2       --> " substrings2
  ;; evaluate each pair of patterns
  ;; to determine if they are similar within a tolerance of r = 0.
  set substr-copy substrings2
  calculate-distancies
  let S2 similar
  output-print word "Similar patterns of length = 2   -->  " S2
  output-print ""
    if S2 = 0 [stop-message stop]
  ;; Creating patterns with m = 3
  set m m + 1
  set index 0
  set similar 0
  set substrings []
  built-substrings
  set input-string but-last input-string          ;; remove the sentinel character
  let substrings3 substrings
  output-print word "Set of patterns with m + 1 = 3   --> " substrings3
  set substr-copy substrings3
  calculate-distancies
  let S3 similar
  output-print word "Similar patterns of length = 3   -->  " S3
  output-print ""
  if S3 = 0 [stop-message stop]
  ;; sample entropy computation
  set sampen precision (- ln (S3 / S2)) 5
  output-type word "***     SAMPLE ENTROPY = - ln " S3
  output-type word " / " S2
  output-type word " = " sampen
  output-print "     ***"
end 

;; recursively collect every substring of length m from the input-string,
;; advancing index one position at each call
to built-substrings
  set substrings lput (substring input-string index (index + m)) substrings
  set index index + 1
  if index < (length input-string - m) [built-substrings]
end 

;; recursively compare the first pattern in substr-copy with all the following ones,
;; counting the identical pairs (tolerance r = 0), then drop it and repeat
to calculate-distancies
  set index 0
  let short.copy but-first substr-copy
  repeat length short.copy [
    if item 0 substr-copy = item index short.copy
      [set similar similar + 1]
       set index index + 1
       ]
  if length short.copy > 1 [
    set substr-copy but-first substr-copy
    calculate-distancies
  ]
  set index 0
end 

to start-message
  output-print ""
  output-print "Let's compute the sample entropy for the given sequence (with r=0 and m=2)."
  output-print ""
end 

to stop-message
  output-print "It is not possible to compute the sample entropy for this string"
  output-type "because the similar patterns of length " output-type m output-type " is null."
  if m = 2 [set input-string but-last input-string]
end 


;; Public Domain: to the extent possible under law, the author has waived
;; all copyright and related or neighboring rights to this model.
