
git.blender.org/blender.git
author    Tamito Kajiyama <rd6t-kjym@asahi-net.or.jp>  2013-11-24 04:28:01 +0400
committer Tamito Kajiyama <rd6t-kjym@asahi-net.or.jp>  2014-01-28 18:33:57 +0400
commit    6498b96ce7081db039354228213d72e8c70bd3aa (patch)
tree      17aa60d6dcec97497df8c349b6023f35cd06d78d /release/scripts/freestyle
parent    df471369e26b08232000763b2128d6f58a916f82 (diff)
Reorganized the Freestyle Python API in a hierarchical package structure.
Both C- and Python-coded API components were rearranged into logical groups.
The new Python modules are packaged as follows:

freestyle                    - top-level package
freestyle.types              - classes for core data structures (e.g., the view map)
freestyle.chainingiterators  - pre-defined chaining iterators
freestyle.functions          - pre-defined 0D and 1D functions
freestyle.predicates         - pre-defined 0D and 1D predicates
freestyle.shaders            - pre-defined stroke shaders
freestyle.utils              - utility functions

The Python modules are installed in scripts/freestyle/modules. Pre-defined
styles are installed in scripts/freestyle/styles.

To do: update the styles according to the new Freestyle API package structure.
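The reorganization boils down to a module mapping. The sketch below records the correspondence between the old flat style_modules files and the new package submodules, inferred from the commit message and the diffstat below; the dict and helper names are illustrative, not part of the Freestyle API.

```python
# Mapping from the old flat style_modules layout to the new hierarchical
# freestyle package, inferred from the commit message and diffstat.
OLD_TO_NEW = {
    "style_modules/ChainingIterators.py": "freestyle.chainingiterators",
    "style_modules/Functions0D.py": "freestyle.functions",
    "style_modules/Functions1D.py": "freestyle.functions",
    "style_modules/PredicatesB1D.py": "freestyle.predicates",
    "style_modules/PredicatesU0D.py": "freestyle.predicates",
    "style_modules/PredicatesU1D.py": "freestyle.predicates",
    "style_modules/logical_operators.py": "freestyle.predicates",
    "style_modules/shaders.py": "freestyle.shaders",
}

def new_module_for(old_path):
    """Return the new freestyle submodule that replaces an old style module."""
    return OLD_TO_NEW[old_path]
```

For example, code that used to import stroke shaders from the flat shaders.py style module would now import them from freestyle.shaders.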
Diffstat (limited to 'release/scripts/freestyle')
-rw-r--r--  release/scripts/freestyle/modules/freestyle/__init__.py  21
-rw-r--r--  release/scripts/freestyle/modules/freestyle/chainingiterators.py  716
-rw-r--r--  release/scripts/freestyle/modules/freestyle/functions.py  197
-rw-r--r--  release/scripts/freestyle/modules/freestyle/predicates.py  516
-rw-r--r--  release/scripts/freestyle/modules/freestyle/shaders.py  1249
-rw-r--r--  release/scripts/freestyle/modules/freestyle/types.py  81
-rw-r--r--  release/scripts/freestyle/modules/freestyle/utils.py  24
-rw-r--r--  release/scripts/freestyle/modules/parameter_editor.py (renamed from release/scripts/freestyle/style_modules/parameter_editor.py)  2
-rw-r--r--  release/scripts/freestyle/style_modules/ChainingIterators.py  713
-rw-r--r--  release/scripts/freestyle/style_modules/Functions0D.py  105
-rw-r--r--  release/scripts/freestyle/style_modules/Functions1D.py  58
-rw-r--r--  release/scripts/freestyle/style_modules/PredicatesB1D.py  73
-rw-r--r--  release/scripts/freestyle/style_modules/PredicatesU0D.py  96
-rw-r--r--  release/scripts/freestyle/style_modules/PredicatesU1D.py  342
-rw-r--r--  release/scripts/freestyle/style_modules/logical_operators.py  47
-rw-r--r--  release/scripts/freestyle/style_modules/shaders.py  1227
-rw-r--r--  release/scripts/freestyle/styles/anisotropic_diffusion.py (renamed from release/scripts/freestyle/style_modules/anisotropic_diffusion.py)  0
-rw-r--r--  release/scripts/freestyle/styles/apriori_and_causal_density.py (renamed from release/scripts/freestyle/style_modules/apriori_and_causal_density.py)  0
-rw-r--r--  release/scripts/freestyle/styles/apriori_density.py (renamed from release/scripts/freestyle/style_modules/apriori_density.py)  0
-rw-r--r--  release/scripts/freestyle/styles/backbone_stretcher.py (renamed from release/scripts/freestyle/style_modules/backbone_stretcher.py)  0
-rw-r--r--  release/scripts/freestyle/styles/blueprint_circles.py (renamed from release/scripts/freestyle/style_modules/blueprint_circles.py)  0
-rw-r--r--  release/scripts/freestyle/styles/blueprint_ellipses.py (renamed from release/scripts/freestyle/style_modules/blueprint_ellipses.py)  0
-rw-r--r--  release/scripts/freestyle/styles/blueprint_squares.py (renamed from release/scripts/freestyle/style_modules/blueprint_squares.py)  0
-rw-r--r--  release/scripts/freestyle/styles/cartoon.py (renamed from release/scripts/freestyle/style_modules/cartoon.py)  0
-rw-r--r--  release/scripts/freestyle/styles/contour.py (renamed from release/scripts/freestyle/style_modules/contour.py)  0
-rw-r--r--  release/scripts/freestyle/styles/curvature2d.py (renamed from release/scripts/freestyle/style_modules/curvature2d.py)  0
-rw-r--r--  release/scripts/freestyle/styles/external_contour.py (renamed from release/scripts/freestyle/style_modules/external_contour.py)  0
-rw-r--r--  release/scripts/freestyle/styles/external_contour_sketchy.py (renamed from release/scripts/freestyle/style_modules/external_contour_sketchy.py)  0
-rw-r--r--  release/scripts/freestyle/styles/external_contour_smooth.py (renamed from release/scripts/freestyle/style_modules/external_contour_smooth.py)  0
-rw-r--r--  release/scripts/freestyle/styles/haloing.py (renamed from release/scripts/freestyle/style_modules/haloing.py)  0
-rw-r--r--  release/scripts/freestyle/styles/ignore_small_occlusions.py (renamed from release/scripts/freestyle/style_modules/ignore_small_occlusions.py)  0
-rw-r--r--  release/scripts/freestyle/styles/invisible_lines.py (renamed from release/scripts/freestyle/style_modules/invisible_lines.py)  0
-rw-r--r--  release/scripts/freestyle/styles/japanese_bigbrush.py (renamed from release/scripts/freestyle/style_modules/japanese_bigbrush.py)  0
-rw-r--r--  release/scripts/freestyle/styles/long_anisotropically_dense.py (renamed from release/scripts/freestyle/style_modules/long_anisotropically_dense.py)  0
-rw-r--r--  release/scripts/freestyle/styles/multiple_parameterization.py (renamed from release/scripts/freestyle/style_modules/multiple_parameterization.py)  0
-rw-r--r--  release/scripts/freestyle/styles/nature.py (renamed from release/scripts/freestyle/style_modules/nature.py)  0
-rw-r--r--  release/scripts/freestyle/styles/near_lines.py (renamed from release/scripts/freestyle/style_modules/near_lines.py)  0
-rw-r--r--  release/scripts/freestyle/styles/occluded_by_specific_object.py (renamed from release/scripts/freestyle/style_modules/occluded_by_specific_object.py)  0
-rw-r--r--  release/scripts/freestyle/styles/polygonalize.py (renamed from release/scripts/freestyle/style_modules/polygonalize.py)  0
-rw-r--r--  release/scripts/freestyle/styles/qi0.py (renamed from release/scripts/freestyle/style_modules/qi0.py)  0
-rw-r--r--  release/scripts/freestyle/styles/qi0_not_external_contour.py (renamed from release/scripts/freestyle/style_modules/qi0_not_external_contour.py)  0
-rw-r--r--  release/scripts/freestyle/styles/qi1.py (renamed from release/scripts/freestyle/style_modules/qi1.py)  0
-rw-r--r--  release/scripts/freestyle/styles/qi2.py (renamed from release/scripts/freestyle/style_modules/qi2.py)  0
-rw-r--r--  release/scripts/freestyle/styles/sequentialsplit_sketchy.py (renamed from release/scripts/freestyle/style_modules/sequentialsplit_sketchy.py)  0
-rw-r--r--  release/scripts/freestyle/styles/sketchy_multiple_parameterization.py (renamed from release/scripts/freestyle/style_modules/sketchy_multiple_parameterization.py)  0
-rw-r--r--  release/scripts/freestyle/styles/sketchy_topology_broken.py (renamed from release/scripts/freestyle/style_modules/sketchy_topology_broken.py)  0
-rw-r--r--  release/scripts/freestyle/styles/sketchy_topology_preserved.py (renamed from release/scripts/freestyle/style_modules/sketchy_topology_preserved.py)  0
-rw-r--r--  release/scripts/freestyle/styles/split_at_highest_2d_curvatures.py (renamed from release/scripts/freestyle/style_modules/split_at_highest_2d_curvatures.py)  0
-rw-r--r--  release/scripts/freestyle/styles/split_at_tvertices.py (renamed from release/scripts/freestyle/style_modules/split_at_tvertices.py)  0
-rw-r--r--  release/scripts/freestyle/styles/stroke_texture.py (renamed from release/scripts/freestyle/style_modules/stroke_texture.py)  0
-rw-r--r--  release/scripts/freestyle/styles/suggestive.py (renamed from release/scripts/freestyle/style_modules/suggestive.py)  0
-rw-r--r--  release/scripts/freestyle/styles/thickness_fof_depth_discontinuity.py (renamed from release/scripts/freestyle/style_modules/thickness_fof_depth_discontinuity.py)  0
-rw-r--r--  release/scripts/freestyle/styles/tipremover.py (renamed from release/scripts/freestyle/style_modules/tipremover.py)  0
-rw-r--r--  release/scripts/freestyle/styles/tvertex_remover.py (renamed from release/scripts/freestyle/style_modules/tvertex_remover.py)  0
-rw-r--r--  release/scripts/freestyle/styles/uniformpruning_zsort.py (renamed from release/scripts/freestyle/style_modules/uniformpruning_zsort.py)  0
55 files changed, 2806 insertions(+), 2661 deletions(-)
diff --git a/release/scripts/freestyle/modules/freestyle/__init__.py b/release/scripts/freestyle/modules/freestyle/__init__.py
new file mode 100644
index 00000000000..284b96ee875
--- /dev/null
+++ b/release/scripts/freestyle/modules/freestyle/__init__.py
@@ -0,0 +1,21 @@
+# ##### BEGIN GPL LICENSE BLOCK #####
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
+#
+# ##### END GPL LICENSE BLOCK #####
+
+# module members
+from _freestyle import Operators
+from . import chainingiterators, functions, predicates, types, utils
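As a rough illustration of what the `from . import ...` line in this `__init__.py` achieves, the stand-alone sketch below mimics how importing submodules binds them as attributes of the package object. `types.ModuleType` objects stand in for the real Blender modules; the submodule names are the six listed in the commit message (the real `__init__.py` above omits `shaders` from its import line).

```python
# Stand-alone sketch: importing a package's submodules makes each one an
# attribute of the package object, so user code can address everything
# under a single namespace.  ModuleType stand-ins replace the real modules.
import types

freestyle = types.ModuleType("freestyle")
for name in ("types", "chainingiterators", "functions",
             "predicates", "shaders", "utils"):
    submodule = types.ModuleType("freestyle." + name)
    setattr(freestyle, name, submodule)  # the effect of "from . import name"

assert freestyle.types.__name__ == "freestyle.types"
```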
diff --git a/release/scripts/freestyle/modules/freestyle/chainingiterators.py b/release/scripts/freestyle/modules/freestyle/chainingiterators.py
new file mode 100644
index 00000000000..1e8c1927432
--- /dev/null
+++ b/release/scripts/freestyle/modules/freestyle/chainingiterators.py
@@ -0,0 +1,716 @@
+# ##### BEGIN GPL LICENSE BLOCK #####
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
+#
+# ##### END GPL LICENSE BLOCK #####
+
+# module members
+from _freestyle import (
+ ChainPredicateIterator,
+ ChainSilhouetteIterator,
+ )
+
+# modules for implementing chaining iterators
+from freestyle.types import (
+    AdjacencyIterator, ChainingIterator, Nature, TVertex,
+    )
+from freestyle.predicates import ExternalContourUP1D
+from freestyle.utils import ContextFunctions as CF
+import bpy
+
+## The natural chaining iterator.
+## It follows edges of the same nature, following the topology of
+## objects, with precedence given to silhouettes, then borders,
+## then suggestive contours, then everything else.  It never chains
+## the same ViewEdge twice.  You can specify whether to stay in the selection or not.
+class pyChainSilhouetteIterator(ChainingIterator):
+ def __init__(self, stayInSelection=True):
+ ChainingIterator.__init__(self, stayInSelection, True, None, True)
+ def init(self):
+ pass
+ def traverse(self, iter):
+ winner = None
+ it = AdjacencyIterator(iter)
+ tvertex = self.next_vertex
+ if type(tvertex) is TVertex:
+ mateVE = tvertex.get_mate(self.current_edge)
+ while not it.is_end:
+ ve = it.object
+ if ve.id == mateVE.id:
+ winner = ve
+ break
+ it.increment()
+ else:
+ ## case of NonTVertex
+ natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
+ for i in range(len(natures)):
+ currentNature = self.current_edge.nature
+ if (natures[i] & currentNature) != 0:
+ count=0
+ while not it.is_end:
+ visitNext = 0
+ oNature = it.object.nature
+ if (oNature & natures[i]) != 0:
+ if natures[i] != oNature:
+ for j in range(i):
+ if (natures[j] & oNature) != 0:
+ visitNext = 1
+ break
+ if visitNext != 0:
+ break
+ count = count+1
+ winner = it.object
+ it.increment()
+ if count != 1:
+ winner = None
+ break
+ return winner
+
+## The natural chaining iterator.
+## It follows edges of the same nature on the same objects,
+## with precedence given to silhouettes, then borders, then
+## suggestive contours, then everything else.  It never chains
+## the same ViewEdge twice.  You can specify whether to stay in
+## the selection or not, and whether to iterate over edges that
+## have already been visited or not.
+class pyChainSilhouetteGenericIterator(ChainingIterator):
+ def __init__(self, stayInSelection=True, stayInUnvisited=True):
+ ChainingIterator.__init__(self, stayInSelection, stayInUnvisited, None, True)
+ def init(self):
+ pass
+ def traverse(self, iter):
+ winner = None
+ it = AdjacencyIterator(iter)
+ tvertex = self.next_vertex
+ if type(tvertex) is TVertex:
+ mateVE = tvertex.get_mate(self.current_edge)
+ while not it.is_end:
+ ve = it.object
+ if ve.id == mateVE.id:
+ winner = ve
+ break
+ it.increment()
+ else:
+ ## case of NonTVertex
+ natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
+ for i in range(len(natures)):
+ currentNature = self.current_edge.nature
+ if (natures[i] & currentNature) != 0:
+ count=0
+ while not it.is_end:
+ visitNext = 0
+ oNature = it.object.nature
+ ve = it.object
+ if ve.id == self.current_edge.id:
+ it.increment()
+ continue
+ if (oNature & natures[i]) != 0:
+ if natures[i] != oNature:
+ for j in range(i):
+ if (natures[j] & oNature) != 0:
+ visitNext = 1
+ break
+ if visitNext != 0:
+ break
+ count = count+1
+ winner = ve
+ it.increment()
+ if count != 1:
+ winner = None
+ break
+ return winner
+
+class pyExternalContourChainingIterator(ChainingIterator):
+ def __init__(self):
+ ChainingIterator.__init__(self, False, True, None, True)
+ self._isExternalContour = ExternalContourUP1D()
+ def init(self):
+ self._nEdges = 0
+ self._isInSelection = 1
+ def checkViewEdge(self, ve, orientation):
+ if orientation != 0:
+ vertex = ve.second_svertex()
+ else:
+ vertex = ve.first_svertex()
+ it = AdjacencyIterator(vertex,1,1)
+ while not it.is_end:
+ ave = it.object
+ if self._isExternalContour(ave):
+ return 1
+ it.increment()
+        print("pyExternalContourChainingIterator: didn't find the next edge")
+ return 0
+ def traverse(self, iter):
+ winner = None
+ it = AdjacencyIterator(iter)
+ while not it.is_end:
+ ve = it.object
+ if self._isExternalContour(ve):
+ if ve.time_stamp == CF.get_time_stamp():
+ winner = ve
+ it.increment()
+
+ self._nEdges = self._nEdges+1
+ if winner is None:
+ orient = 1
+ it = AdjacencyIterator(iter)
+ while not it.is_end:
+ ve = it.object
+ if it.is_incoming:
+ orient = 0
+ good = self.checkViewEdge(ve,orient)
+ if good != 0:
+ winner = ve
+ it.increment()
+ return winner
+
+## The natural chaining iterator,
+## with sketchy multiple touches (several rounds per ViewEdge).
+class pySketchyChainSilhouetteIterator(ChainingIterator):
+ def __init__(self, nRounds=3,stayInSelection=True):
+ ChainingIterator.__init__(self, stayInSelection, False, None, True)
+ self._timeStamp = CF.get_time_stamp()+nRounds
+ self._nRounds = nRounds
+ def init(self):
+ self._timeStamp = CF.get_time_stamp()+self._nRounds
+ def traverse(self, iter):
+ winner = None
+ it = AdjacencyIterator(iter)
+ tvertex = self.next_vertex
+ if type(tvertex) is TVertex:
+ mateVE = tvertex.get_mate(self.current_edge)
+ while not it.is_end:
+ ve = it.object
+ if ve.id == mateVE.id:
+ winner = ve
+ break
+ it.increment()
+ else:
+ ## case of NonTVertex
+ natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
+ for i in range(len(natures)):
+ currentNature = self.current_edge.nature
+ if (natures[i] & currentNature) != 0:
+ count=0
+ while not it.is_end:
+ visitNext = 0
+ oNature = it.object.nature
+ ve = it.object
+ if ve.id == self.current_edge.id:
+ it.increment()
+ continue
+ if (oNature & natures[i]) != 0:
+                            if natures[i] != oNature:
+ for j in range(i):
+ if (natures[j] & oNature) != 0:
+ visitNext = 1
+ break
+ if visitNext != 0:
+ break
+ count = count+1
+ winner = ve
+ it.increment()
+ if count != 1:
+ winner = None
+ break
+ if winner is None:
+ winner = self.current_edge
+ if winner.chaining_time_stamp == self._timeStamp:
+ winner = None
+ return winner
+
+
+# Chaining iterator designed for the sketchy style.
+# It can chain the same ViewEdge several times
+# in order to produce multiple strokes per ViewEdge.
+class pySketchyChainingIterator(ChainingIterator):
+ def __init__(self, nRounds=3, stayInSelection=True):
+ ChainingIterator.__init__(self, stayInSelection, False, None, True)
+ self._timeStamp = CF.get_time_stamp()+nRounds
+ self._nRounds = nRounds
+ def init(self):
+ self._timeStamp = CF.get_time_stamp()+self._nRounds
+ def traverse(self, iter):
+ winner = None
+ found = False
+ it = AdjacencyIterator(iter)
+ while not it.is_end:
+ ve = it.object
+ if ve.id == self.current_edge.id:
+ found = True
+ it.increment()
+ continue
+ winner = ve
+ it.increment()
+ if not found:
+ # This is a fatal error condition: self.current_edge must be found
+ # among the edges seen by the AdjacencyIterator [bug #35695].
+ if bpy.app.debug_freestyle:
+ print('pySketchyChainingIterator: current edge not found')
+ return None
+ if winner is None:
+ winner = self.current_edge
+ if winner.chaining_time_stamp == self._timeStamp:
+ return None
+ return winner
+
+
+## Chaining iterator that fills small occlusions.
+## percent:
+##     the max length of the occluded part, expressed
+##     as a fraction of the total chain length
+class pyFillOcclusionsRelativeChainingIterator(ChainingIterator):
+ def __init__(self, percent):
+ ChainingIterator.__init__(self, False, True, None, True)
+ self._length = 0
+ self._percent = float(percent)
+ def init(self):
+        # the prospective chain length is computed at most once
+        # per chain evaluation, so it is reset here before
+        # each new chain:
+ self._length = 0
+ def traverse(self, iter):
+ winner = None
+ winnerOrientation = 0
+ #print(self.current_edge.id.first, self.current_edge.id.second)
+ it = AdjacencyIterator(iter)
+ tvertex = self.next_vertex
+ if type(tvertex) is TVertex:
+ mateVE = tvertex.get_mate(self.current_edge)
+ while not it.is_end:
+ ve = it.object
+ if ve.id == mateVE.id:
+ winner = ve
+ if not it.is_incoming:
+ winnerOrientation = 1
+ else:
+ winnerOrientation = 0
+ break
+ it.increment()
+ else:
+ ## case of NonTVertex
+ natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
+ for nat in natures:
+ if (self.current_edge.nature & nat) != 0:
+ count=0
+ while not it.is_end:
+ ve = it.object
+ if (ve.nature & nat) != 0:
+ count = count+1
+ winner = ve
+ if not it.is_incoming:
+ winnerOrientation = 1
+ else:
+ winnerOrientation = 0
+ it.increment()
+ if count != 1:
+ winner = None
+ break
+ if winner is not None:
+ # check whether this edge was part of the selection
+ if winner.time_stamp != CF.get_time_stamp():
+ #print("---", winner.id.first, winner.id.second)
+ # if not, let's check whether it's short enough with
+ # respect to the chain made without staying in the selection
+ #------------------------------------------------------------
+ # Did we compute the prospective chain length already ?
+ if self._length == 0:
+ #if not, let's do it
+ _it = pyChainSilhouetteGenericIterator(0,0)
+ _it.begin = winner
+ _it.current_edge = winner
+ _it.orientation = winnerOrientation
+ _it.init()
+ while not _it.is_end:
+ ve = _it.object
+ #print("--------", ve.id.first, ve.id.second)
+ self._length = self._length + ve.length_2d
+ _it.increment()
+ if _it.is_begin:
+                            break
+ _it.begin = winner
+ _it.current_edge = winner
+ _it.orientation = winnerOrientation
+ if not _it.is_begin:
+ _it.decrement()
+ while (not _it.is_end) and (not _it.is_begin):
+ ve = _it.object
+ #print("--------", ve.id.first, ve.id.second)
+ self._length = self._length + ve.length_2d
+ _it.decrement()
+
+ # let's do the comparison:
+            # now let's compute the length of this connected non-selected part:
+ connexl = 0
+ _cit = pyChainSilhouetteGenericIterator(0,0)
+ _cit.begin = winner
+ _cit.current_edge = winner
+ _cit.orientation = winnerOrientation
+ _cit.init()
+ while _cit.is_end == 0 and _cit.object.time_stamp != CF.get_time_stamp():
+ ve = _cit.object
+ #print("-------- --------", ve.id.first, ve.id.second)
+ connexl = connexl + ve.length_2d
+ _cit.increment()
+ if connexl > self._percent * self._length:
+ winner = None
+ return winner
+
+## Chaining iterator that fills small occlusions.
+## length:
+##     the max length of the occluded part,
+##     expressed in pixels
+class pyFillOcclusionsAbsoluteChainingIterator(ChainingIterator):
+ def __init__(self, length):
+ ChainingIterator.__init__(self, False, True, None, True)
+ self._length = float(length)
+ def init(self):
+ pass
+ def traverse(self, iter):
+ winner = None
+ winnerOrientation = 0
+ #print(self.current_edge.id.first, self.current_edge.id.second)
+ it = AdjacencyIterator(iter)
+ tvertex = self.next_vertex
+ if type(tvertex) is TVertex:
+ mateVE = tvertex.get_mate(self.current_edge)
+ while not it.is_end:
+ ve = it.object
+ if ve.id == mateVE.id:
+ winner = ve
+ if not it.is_incoming:
+ winnerOrientation = 1
+ else:
+ winnerOrientation = 0
+ break
+ it.increment()
+ else:
+ ## case of NonTVertex
+ natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
+ for nat in natures:
+ if (self.current_edge.nature & nat) != 0:
+ count=0
+ while not it.is_end:
+ ve = it.object
+ if (ve.nature & nat) != 0:
+ count = count+1
+ winner = ve
+ if not it.is_incoming:
+ winnerOrientation = 1
+ else:
+ winnerOrientation = 0
+ it.increment()
+ if count != 1:
+ winner = None
+ break
+ if winner is not None:
+ # check whether this edge was part of the selection
+ if winner.time_stamp != CF.get_time_stamp():
+ #print("---", winner.id.first, winner.id.second)
+            # now let's compute the length of this connected non-selected part:
+ connexl = 0
+ _cit = pyChainSilhouetteGenericIterator(0,0)
+ _cit.begin = winner
+ _cit.current_edge = winner
+ _cit.orientation = winnerOrientation
+ _cit.init()
+ while _cit.is_end == 0 and _cit.object.time_stamp != CF.get_time_stamp():
+ ve = _cit.object
+ #print("-------- --------", ve.id.first, ve.id.second)
+ connexl = connexl + ve.length_2d
+ _cit.increment()
+ if connexl > self._length:
+ winner = None
+ return winner
+
+
+## Chaining iterator that fills small occlusions.
+## percent: the max length of the occluded part as a fraction
+##     of the total chain length.
+## l: the max length of the occluded part, expressed in pixels.
+class pyFillOcclusionsAbsoluteAndRelativeChainingIterator(ChainingIterator):
+ def __init__(self, percent, l):
+ ChainingIterator.__init__(self, False, True, None, True)
+ self._length = 0
+ self._absLength = l
+ self._percent = float(percent)
+ def init(self):
+        # the prospective chain length is computed at most once
+        # per chain evaluation, so it is reset here before
+        # each new chain:
+ self._length = 0
+ def traverse(self, iter):
+ winner = None
+ winnerOrientation = 0
+ #print(self.current_edge.id.first, self.current_edge.id.second)
+ it = AdjacencyIterator(iter)
+ tvertex = self.next_vertex
+ if type(tvertex) is TVertex:
+ mateVE = tvertex.get_mate(self.current_edge)
+ while not it.is_end:
+ ve = it.object
+ if ve.id == mateVE.id:
+ winner = ve
+ if not it.is_incoming:
+ winnerOrientation = 1
+ else:
+ winnerOrientation = 0
+ break
+ it.increment()
+ else:
+ ## case of NonTVertex
+ natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
+ for nat in natures:
+ if (self.current_edge.nature & nat) != 0:
+ count=0
+ while not it.is_end:
+ ve = it.object
+ if (ve.nature & nat) != 0:
+ count = count+1
+ winner = ve
+ if not it.is_incoming:
+ winnerOrientation = 1
+ else:
+ winnerOrientation = 0
+ it.increment()
+ if count != 1:
+ winner = None
+ break
+ if winner is not None:
+ # check whether this edge was part of the selection
+ if winner.time_stamp != CF.get_time_stamp():
+ #print("---", winner.id.first, winner.id.second)
+ # if not, let's check whether it's short enough with
+ # respect to the chain made without staying in the selection
+ #------------------------------------------------------------
+ # Did we compute the prospective chain length already ?
+ if self._length == 0:
+ #if not, let's do it
+ _it = pyChainSilhouetteGenericIterator(0,0)
+ _it.begin = winner
+ _it.current_edge = winner
+ _it.orientation = winnerOrientation
+ _it.init()
+ while not _it.is_end:
+ ve = _it.object
+ #print("--------", ve.id.first, ve.id.second)
+ self._length = self._length + ve.length_2d
+ _it.increment()
+ if _it.is_begin:
+                            break
+ _it.begin = winner
+ _it.current_edge = winner
+ _it.orientation = winnerOrientation
+ if not _it.is_begin:
+ _it.decrement()
+ while (not _it.is_end) and (not _it.is_begin):
+ ve = _it.object
+ #print("--------", ve.id.first, ve.id.second)
+ self._length = self._length + ve.length_2d
+ _it.decrement()
+
+ # let's do the comparison:
+            # now let's compute the length of this connected non-selected part:
+ connexl = 0
+ _cit = pyChainSilhouetteGenericIterator(0,0)
+ _cit.begin = winner
+ _cit.current_edge = winner
+ _cit.orientation = winnerOrientation
+ _cit.init()
+ while _cit.is_end == 0 and _cit.object.time_stamp != CF.get_time_stamp():
+ ve = _cit.object
+ #print("-------- --------", ve.id.first, ve.id.second)
+ connexl = connexl + ve.length_2d
+ _cit.increment()
+ if (connexl > self._percent * self._length) or (connexl > self._absLength):
+ winner = None
+ return winner
+
+## Chaining iterator that fills small occlusions without caring
+## about the actual selection.
+## percent: the max length of the occluded part as a fraction
+##     of the total chain length.
+## l: the max length of the occluded part, expressed in pixels.
+class pyFillQi0AbsoluteAndRelativeChainingIterator(ChainingIterator):
+ def __init__(self, percent, l):
+ ChainingIterator.__init__(self, False, True, None, True)
+ self._length = 0
+ self._absLength = l
+ self._percent = float(percent)
+ def init(self):
+        # the prospective chain length is computed at most once
+        # per chain evaluation, so it is reset here before
+        # each new chain:
+ self._length = 0
+ def traverse(self, iter):
+ winner = None
+ winnerOrientation = 0
+ #print(self.current_edge.id.first, self.current_edge.id.second)
+ it = AdjacencyIterator(iter)
+ tvertex = self.next_vertex
+ if type(tvertex) is TVertex:
+ mateVE = tvertex.get_mate(self.current_edge)
+ while not it.is_end:
+ ve = it.object
+ if ve.id == mateVE.id:
+ winner = ve
+ if not it.is_incoming:
+ winnerOrientation = 1
+ else:
+ winnerOrientation = 0
+ break
+ it.increment()
+ else:
+ ## case of NonTVertex
+ natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
+ for nat in natures:
+ if (self.current_edge.nature & nat) != 0:
+ count=0
+ while not it.is_end:
+ ve = it.object
+ if (ve.nature & nat) != 0:
+ count = count+1
+ winner = ve
+ if not it.is_incoming:
+ winnerOrientation = 1
+ else:
+ winnerOrientation = 0
+ it.increment()
+ if count != 1:
+ winner = None
+ break
+ if winner is not None:
+ # check whether this edge was part of the selection
+ if winner.qi != 0:
+ #print("---", winner.id.first, winner.id.second)
+ # if not, let's check whether it's short enough with
+ # respect to the chain made without staying in the selection
+ #------------------------------------------------------------
+ # Did we compute the prospective chain length already ?
+ if self._length == 0:
+ #if not, let's do it
+ _it = pyChainSilhouetteGenericIterator(0,0)
+ _it.begin = winner
+ _it.current_edge = winner
+ _it.orientation = winnerOrientation
+ _it.init()
+ while not _it.is_end:
+ ve = _it.object
+ #print("--------", ve.id.first, ve.id.second)
+ self._length = self._length + ve.length_2d
+ _it.increment()
+ if _it.is_begin:
+                            break
+ _it.begin = winner
+ _it.current_edge = winner
+ _it.orientation = winnerOrientation
+ if not _it.is_begin:
+ _it.decrement()
+ while (not _it.is_end) and (not _it.is_begin):
+ ve = _it.object
+ #print("--------", ve.id.first, ve.id.second)
+ self._length = self._length + ve.length_2d
+ _it.decrement()
+
+ # let's do the comparison:
+            # now let's compute the length of this connected non-selected part:
+ connexl = 0
+ _cit = pyChainSilhouetteGenericIterator(0,0)
+ _cit.begin = winner
+ _cit.current_edge = winner
+ _cit.orientation = winnerOrientation
+ _cit.init()
+ while not _cit.is_end and _cit.object.qi != 0:
+ ve = _cit.object
+ #print("-------- --------", ve.id.first, ve.id.second)
+ connexl = connexl + ve.length_2d
+ _cit.increment()
+ if (connexl > self._percent * self._length) or (connexl > self._absLength):
+ winner = None
+ return winner
+
+
+## The natural chaining iterator.
+## It follows edges of the same nature on the same objects,
+## with precedence given to silhouettes, then borders, then
+## suggestive contours, then everything else.  It never chains
+## the same ViewEdge twice.  You can specify whether to stay in the selection or not.
+class pyNoIdChainSilhouetteIterator(ChainingIterator):
+ def __init__(self, stayInSelection=True):
+ ChainingIterator.__init__(self, stayInSelection, True, None, True)
+ def init(self):
+ pass
+ def traverse(self, iter):
+ winner = None
+ it = AdjacencyIterator(iter)
+ tvertex = self.next_vertex
+ if type(tvertex) is TVertex:
+ mateVE = tvertex.get_mate(self.current_edge)
+ while not it.is_end:
+ ve = it.object
+ feB = self.current_edge.last_fedge
+ feA = ve.first_fedge
+ vB = feB.second_svertex
+ vA = feA.first_svertex
+ if vA.id.first == vB.id.first:
+ winner = ve
+ break
+ feA = self.current_edge.first_fedge
+ feB = ve.last_fedge
+ vB = feB.second_svertex
+ vA = feA.first_svertex
+ if vA.id.first == vB.id.first:
+ winner = ve
+ break
+ feA = self.current_edge.last_fedge
+ feB = ve.last_fedge
+ vB = feB.second_svertex
+ vA = feA.second_svertex
+ if vA.id.first == vB.id.first:
+ winner = ve
+ break
+ feA = self.current_edge.first_fedge
+ feB = ve.first_fedge
+ vB = feB.first_svertex
+ vA = feA.first_svertex
+ if vA.id.first == vB.id.first:
+ winner = ve
+ break
+ it.increment()
+ else:
+ ## case of NonTVertex
+ natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
+ for i in range(len(natures)):
+ currentNature = self.current_edge.nature
+ if (natures[i] & currentNature) != 0:
+ count=0
+ while not it.is_end:
+ visitNext = 0
+ oNature = it.object.nature
+ if (oNature & natures[i]) != 0:
+ if natures[i] != oNature:
+ for j in range(i):
+ if (natures[j] & oNature) != 0:
+ visitNext = 1
+ break
+ if visitNext != 0:
+ break
+ count = count+1
+ winner = it.object
+ it.increment()
+ if count != 1:
+ winner = None
+ break
+ return winner
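The nature-precedence rule that recurs in the `traverse()` methods of this file can be hard to read in its nested-loop form. Below is a stand-alone sketch of the same selection logic, using plain integer bitflags in place of `freestyle.types.Nature`; the flag names, the three-flag subset, and the `pick_next` helper are illustrative only, not part of the API.

```python
# Stand-alone sketch of the nature-precedence rule in traverse(): only the
# highest-priority nature carried by the current edge is considered; a
# candidate that also carries an even higher-priority nature is left for a
# later chain; and a winner is kept only if it is unique.
SILHOUETTE, BORDER, CREASE = 1, 2, 4
PRECEDENCE = [SILHOUETTE, BORDER, CREASE]  # highest priority first

def pick_next(current_nature, candidate_natures):
    """Return the unique eligible candidate nature, or None."""
    for i, nat in enumerate(PRECEDENCE):
        if not (current_nature & nat):
            continue
        winner, count = None, 0
        for cand in candidate_natures:
            if not (cand & nat):
                continue
            if cand != nat and any(cand & PRECEDENCE[j] for j in range(i)):
                break  # a higher-priority nature will chain this edge instead
            winner, count = cand, count + 1
        return winner if count == 1 else None
    return None
```

For example, a silhouette edge with exactly one adjacent silhouette candidate chains to it, while two crease candidates are ambiguous and end the chain, mirroring the `count != 1` test in the code above.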
diff --git a/release/scripts/freestyle/modules/freestyle/functions.py b/release/scripts/freestyle/modules/freestyle/functions.py
new file mode 100644
index 00000000000..911030c3094
--- /dev/null
+++ b/release/scripts/freestyle/modules/freestyle/functions.py
@@ -0,0 +1,197 @@
+# ##### BEGIN GPL LICENSE BLOCK #####
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
+#
+# ##### END GPL LICENSE BLOCK #####
+
+# module members
+from _freestyle import (
+ ChainingTimeStampF1D,
+ Curvature2DAngleF0D,
+ Curvature2DAngleF1D,
+ CurveNatureF0D,
+ CurveNatureF1D,
+ DensityF0D,
+ DensityF1D,
+ GetCompleteViewMapDensityF1D,
+ GetCurvilinearAbscissaF0D,
+ GetDirectionalViewMapDensityF1D,
+ GetOccludeeF0D,
+ GetOccludeeF1D,
+ GetOccludersF0D,
+ GetOccludersF1D,
+ GetParameterF0D,
+ GetProjectedXF0D,
+ GetProjectedXF1D,
+ GetProjectedYF0D,
+ GetProjectedYF1D,
+ GetProjectedZF0D,
+ GetProjectedZF1D,
+ GetShapeF0D,
+ GetShapeF1D,
+ GetSteerableViewMapDensityF1D,
+ GetViewMapGradientNormF0D,
+ GetViewMapGradientNormF1D,
+ GetXF0D,
+ GetXF1D,
+ GetYF0D,
+ GetYF1D,
+ GetZF0D,
+ GetZF1D,
+ IncrementChainingTimeStampF1D,
+ LocalAverageDepthF0D,
+ LocalAverageDepthF1D,
+ MaterialF0D,
+ Normal2DF0D,
+ Normal2DF1D,
+ Orientation2DF1D,
+ Orientation3DF1D,
+ QuantitativeInvisibilityF0D,
+ QuantitativeInvisibilityF1D,
+ ReadCompleteViewMapPixelF0D,
+ ReadMapPixelF0D,
+ ReadSteerableViewMapPixelF0D,
+ ShapeIdF0D,
+ TimeStampF1D,
+ VertexOrientation2DF0D,
+ VertexOrientation3DF0D,
+ ZDiscontinuityF0D,
+ ZDiscontinuityF1D,
+    integrate,
+ )
+
+# modules for implementing functions
+from freestyle.types import (
+    CurvePoint,
+    IntegrationType,
+    UnaryFunction0DDouble,
+    UnaryFunction0DMaterial,
+    UnaryFunction0DVec2f,
+    UnaryFunction1DDouble,
+    )
+from freestyle.utils import ContextFunctions as CF
+import math
+import mathutils
+
+## Functions for 0D elements (vertices)
+#######################################
+
+class CurveMaterialF0D(UnaryFunction0DMaterial):
+ # A replacement of the built-in MaterialF0D for stroke creation.
+ # MaterialF0D does not work with Curves and Strokes.
+ def __call__(self, inter):
+ cp = inter.object
+ assert(isinstance(cp, CurvePoint))
+ fe = cp.first_svertex.get_fedge(cp.second_svertex)
+ assert(fe is not None)
+ return fe.material if fe.is_smooth else fe.material_left
+
+class pyInverseCurvature2DAngleF0D(UnaryFunction0DDouble):
+ def __call__(self, inter):
+ func = Curvature2DAngleF0D()
+ c = func(inter)
+        return (math.pi - c)
+
+class pyCurvilinearLengthF0D(UnaryFunction0DDouble):
+ def __call__(self, inter):
+ cp = inter.object
+ assert(isinstance(cp, CurvePoint))
+ return cp.t2d
+
+## estimate anisotropy of density
+class pyDensityAnisotropyF0D(UnaryFunction0DDouble):
+ def __init__(self,level):
+ UnaryFunction0DDouble.__init__(self)
+ self.IsoDensity = ReadCompleteViewMapPixelF0D(level)
+ self.d0Density = ReadSteerableViewMapPixelF0D(0, level)
+ self.d1Density = ReadSteerableViewMapPixelF0D(1, level)
+ self.d2Density = ReadSteerableViewMapPixelF0D(2, level)
+ self.d3Density = ReadSteerableViewMapPixelF0D(3, level)
+ def __call__(self, inter):
+ c_iso = self.IsoDensity(inter)
+ c_0 = self.d0Density(inter)
+ c_1 = self.d1Density(inter)
+ c_2 = self.d2Density(inter)
+ c_3 = self.d3Density(inter)
+ cMax = max(max(c_0,c_1), max(c_2,c_3))
+ cMin = min(min(c_0,c_1), min(c_2,c_3))
+ if c_iso == 0:
+ v = 0
+ else:
+ v = (cMax-cMin)/c_iso
+ return v
+
+## Returns the gradient vector for a pixel
+## l : the level at which one wants to compute the gradient
+class pyViewMapGradientVectorF0D(UnaryFunction0DVec2f):
+ def __init__(self, l):
+ UnaryFunction0DVec2f.__init__(self)
+ self._l = l
+ self._step = math.pow(2,self._l)
+ def __call__(self, iter):
+ p = iter.object.point_2d
+ gx = CF.read_complete_view_map_pixel(self._l, int(p.x+self._step), int(p.y)) - \
+ CF.read_complete_view_map_pixel(self._l, int(p.x), int(p.y))
+ gy = CF.read_complete_view_map_pixel(self._l, int(p.x), int(p.y+self._step)) - \
+ CF.read_complete_view_map_pixel(self._l, int(p.x), int(p.y))
+ return mathutils.Vector([gx, gy])
+
+class pyViewMapGradientNormF0D(UnaryFunction0DDouble):
+ def __init__(self, l):
+ UnaryFunction0DDouble.__init__(self)
+ self._l = l
+ self._step = math.pow(2,self._l)
+ def __call__(self, iter):
+ p = iter.object.point_2d
+ gx = CF.read_complete_view_map_pixel(self._l, int(p.x+self._step), int(p.y)) - \
+ CF.read_complete_view_map_pixel(self._l, int(p.x), int(p.y))
+ gy = CF.read_complete_view_map_pixel(self._l, int(p.x), int(p.y+self._step)) - \
+ CF.read_complete_view_map_pixel(self._l, int(p.x), int(p.y))
+ grad = mathutils.Vector([gx, gy])
+ return grad.length
+
+## Functions for 1D elements (curves)
+#####################################
+
+class pyGetInverseProjectedZF1D(UnaryFunction1DDouble):
+ def __call__(self, inter):
+ func = GetProjectedZF1D()
+ z = func(inter)
+ return (1.0 - z)
+
+class pyGetSquareInverseProjectedZF1D(UnaryFunction1DDouble):
+ def __call__(self, inter):
+ func = GetProjectedZF1D()
+ z = func(inter)
+ return (1.0 - z*z)
+
+class pyDensityAnisotropyF1D(UnaryFunction1DDouble):
+ def __init__(self,level, integrationType=IntegrationType.MEAN, sampling=2.0):
+ UnaryFunction1DDouble.__init__(self, integrationType)
+ self._func = pyDensityAnisotropyF0D(level)
+ self._integration = integrationType
+ self._sampling = sampling
+ def __call__(self, inter):
+ v = integrate(self._func, inter.pointsBegin(self._sampling), inter.pointsEnd(self._sampling), self._integration)
+ return v
+
+class pyViewMapGradientNormF1D(UnaryFunction1DDouble):
+ def __init__(self,l, integrationType, sampling=2.0):
+ UnaryFunction1DDouble.__init__(self, integrationType)
+ self._func = pyViewMapGradientNormF0D(l)
+ self._integration = integrationType
+ self._sampling = sampling
+ def __call__(self, inter):
+ v = integrate(self._func, inter.pointsBegin(self._sampling), inter.pointsEnd(self._sampling), self._integration)
+ return v
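pyViewMapGradientVectorF0D and pyViewMapGradientNormF0D above estimate the view-map gradient by forward differences with a step of 2^l pixels. A minimal standalone sketch of that computation, with a plain callable `read_pixel` standing in for `ContextFunctions.read_complete_view_map_pixel`:

```python
import math

def gradient_at(read_pixel, x, y, level):
    """Forward-difference gradient at (x, y), sampled with a step of
    2**level, mirroring pyViewMapGradientVectorF0D above."""
    step = 2 ** level
    center = read_pixel(x, y)
    gx = read_pixel(x + step, y) - center
    gy = read_pixel(x, y + step) - center
    return gx, gy, math.hypot(gx, gy)  # gradient vector and its norm

# Usage on a linear ramp f(x, y) = 3x + 4y: at level 0 (step = 1) the
# forward difference recovers the slopes exactly.
ramp = lambda x, y: 3 * x + 4 * y
gx, gy, norm = gradient_at(ramp, 10, 20, 0)
```

The norm returned here is what pyViewMapGradientNormF0D feeds to the density and gradient predicates further on.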
diff --git a/release/scripts/freestyle/modules/freestyle/predicates.py b/release/scripts/freestyle/modules/freestyle/predicates.py
new file mode 100644
index 00000000000..ee6006c0ed4
--- /dev/null
+++ b/release/scripts/freestyle/modules/freestyle/predicates.py
@@ -0,0 +1,516 @@
+# ##### BEGIN GPL LICENSE BLOCK #####
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
+#
+# ##### END GPL LICENSE BLOCK #####
+
+# module members
+from _freestyle import (
+ ContourUP1D,
+ DensityLowerThanUP1D,
+ EqualToChainingTimeStampUP1D,
+ EqualToTimeStampUP1D,
+ ExternalContourUP1D,
+ FalseBP1D,
+ FalseUP0D,
+ FalseUP1D,
+ Length2DBP1D,
+ QuantitativeInvisibilityUP1D,
+ SameShapeIdBP1D,
+ ShapeUP1D,
+ TrueBP1D,
+ TrueUP0D,
+ TrueUP1D,
+ ViewMapGradientNormBP1D,
+ WithinImageBoundaryUP1D,
+ )
+
+# modules for implementing predicates
+from freestyle.types import (
+    IntegrationType,
+    Nature,
+    TVertex,
+    BinaryPredicate1D,
+    UnaryPredicate0D,
+    UnaryPredicate1D,
+    )
+from freestyle.functions import (
+    Curvature2DAngleF0D,
+    CurveNatureF1D,
+    DensityF1D,
+    GetCompleteViewMapDensityF1D,
+    GetDirectionalViewMapDensityF1D,
+    GetOccludersF1D,
+    GetProjectedZF1D,
+    GetShapeF1D,
+    GetSteerableViewMapDensityF1D,
+    GetZF1D,
+    QuantitativeInvisibilityF0D,
+    ZDiscontinuityF1D,
+    pyCurvilinearLengthF0D,
+    pyDensityAnisotropyF1D,
+    pyViewMapGradientNormF1D,
+    )
+import random
+
+## Unary predicates for 0D elements (vertices)
+##############################################
+
+class pyHigherCurvature2DAngleUP0D(UnaryPredicate0D):
+ def __init__(self,a):
+ UnaryPredicate0D.__init__(self)
+ self._a = a
+ def __call__(self, inter):
+ func = Curvature2DAngleF0D()
+ a = func(inter)
+ return (a > self._a)
+
+class pyUEqualsUP0D(UnaryPredicate0D):
+ def __init__(self,u, w):
+ UnaryPredicate0D.__init__(self)
+ self._u = u
+ self._w = w
+ def __call__(self, inter):
+ func = pyCurvilinearLengthF0D()
+ u = func(inter)
+ return (u > (self._u-self._w)) and (u < (self._u+self._w))
+
+class pyVertexNatureUP0D(UnaryPredicate0D):
+ def __init__(self,nature):
+ UnaryPredicate0D.__init__(self)
+ self._nature = nature
+ def __call__(self, inter):
+ v = inter.object
+ return (v.nature & self._nature) != 0
+
+## check whether the Interface0D pointed to by an
+## Interface0DIterator is a TVertex, and whether it is
+## the hidden one (inferred from the context)
+class pyBackTVertexUP0D(UnaryPredicate0D):
+ def __init__(self):
+ UnaryPredicate0D.__init__(self)
+ self._getQI = QuantitativeInvisibilityF0D()
+ def __call__(self, iter):
+ if (iter.object.nature & Nature.T_VERTEX) == 0:
+ return 0
+ if iter.is_end:
+ return 0
+ if self._getQI(iter) != 0:
+ return 1
+ return 0
+
+class pyParameterUP0DGoodOne(UnaryPredicate0D):
+ def __init__(self,pmin,pmax):
+ UnaryPredicate0D.__init__(self)
+ self._m = pmin
+ self._M = pmax
+ #self.getCurvilinearAbscissa = GetCurvilinearAbscissaF0D()
+ def __call__(self, inter):
+ #s = self.getCurvilinearAbscissa(inter)
+ u = inter.u
+ #print(u)
+ return ((u>=self._m) and (u<=self._M))
+
+class pyParameterUP0D(UnaryPredicate0D):
+ def __init__(self,pmin,pmax):
+ UnaryPredicate0D.__init__(self)
+ self._m = pmin
+ self._M = pmax
+ #self.getCurvilinearAbscissa = GetCurvilinearAbscissaF0D()
+ def __call__(self, inter):
+ func = Curvature2DAngleF0D()
+ c = func(inter)
+ b1 = (c>0.1)
+ #s = self.getCurvilinearAbscissa(inter)
+ u = inter.u
+ #print(u)
+ b = ((u>=self._m) and (u<=self._M))
+ return b and b1
+
+## Unary predicates for 1D elements (curves)
+############################################
+
+class AndUP1D(UnaryPredicate1D):
+ def __init__(self, pred1, pred2):
+ UnaryPredicate1D.__init__(self)
+ self.__pred1 = pred1
+ self.__pred2 = pred2
+ def __call__(self, inter):
+ return self.__pred1(inter) and self.__pred2(inter)
+
+class OrUP1D(UnaryPredicate1D):
+ def __init__(self, pred1, pred2):
+ UnaryPredicate1D.__init__(self)
+ self.__pred1 = pred1
+ self.__pred2 = pred2
+ def __call__(self, inter):
+ return self.__pred1(inter) or self.__pred2(inter)
+
+class NotUP1D(UnaryPredicate1D):
+ def __init__(self, pred):
+ UnaryPredicate1D.__init__(self)
+ self.__pred = pred
+ def __call__(self, inter):
+ return not self.__pred(inter)
+
+class pyNFirstUP1D(UnaryPredicate1D):
+ def __init__(self, n):
+ UnaryPredicate1D.__init__(self)
+ self.__n = n
+ self.__count = 0
+ def __call__(self, inter):
+ self.__count = self.__count + 1
+ if self.__count <= self.__n:
+ return 1
+ return 0
+
+class pyHigherLengthUP1D(UnaryPredicate1D):
+ def __init__(self,l):
+ UnaryPredicate1D.__init__(self)
+ self._l = l
+ def __call__(self, inter):
+ return (inter.length_2d > self._l)
+
+class pyNatureUP1D(UnaryPredicate1D):
+ def __init__(self,nature):
+ UnaryPredicate1D.__init__(self)
+ self._nature = nature
+ self._getNature = CurveNatureF1D()
+ def __call__(self, inter):
+ if(self._getNature(inter) & self._nature):
+ return 1
+ return 0
+
+class pyHigherNumberOfTurnsUP1D(UnaryPredicate1D):
+ def __init__(self,n,a):
+ UnaryPredicate1D.__init__(self)
+ self._n = n
+ self._a = a
+ def __call__(self, inter):
+ count = 0
+ func = Curvature2DAngleF0D()
+ it = inter.vertices_begin()
+ while not it.is_end:
+ if func(it) > self._a:
+ count = count+1
+ if count > self._n:
+ return 1
+ it.increment()
+ return 0
+
+class pyDensityUP1D(UnaryPredicate1D):
+ def __init__(self,wsize,threshold, integration = IntegrationType.MEAN, sampling=2.0):
+ UnaryPredicate1D.__init__(self)
+ self._wsize = wsize
+ self._threshold = threshold
+ self._integration = integration
+ self._func = DensityF1D(self._wsize, self._integration, sampling)
+ def __call__(self, inter):
+ if self._func(inter) < self._threshold:
+ return 1
+ return 0
+
+class pyLowSteerableViewMapDensityUP1D(UnaryPredicate1D):
+ def __init__(self,threshold, level,integration = IntegrationType.MEAN):
+ UnaryPredicate1D.__init__(self)
+ self._threshold = threshold
+ self._level = level
+ self._integration = integration
+ def __call__(self, inter):
+ func = GetSteerableViewMapDensityF1D(self._level, self._integration)
+ v = func(inter)
+ #print(v)
+ if v < self._threshold:
+ return 1
+ return 0
+
+class pyLowDirectionalViewMapDensityUP1D(UnaryPredicate1D):
+ def __init__(self,threshold, orientation, level,integration = IntegrationType.MEAN):
+ UnaryPredicate1D.__init__(self)
+ self._threshold = threshold
+ self._orientation = orientation
+ self._level = level
+ self._integration = integration
+ def __call__(self, inter):
+ func = GetDirectionalViewMapDensityF1D(self._orientation, self._level, self._integration)
+ v = func(inter)
+ #print(v)
+ if v < self._threshold:
+ return 1
+ return 0
+
+class pyHighSteerableViewMapDensityUP1D(UnaryPredicate1D):
+ def __init__(self,threshold, level,integration = IntegrationType.MEAN):
+ UnaryPredicate1D.__init__(self)
+ self._threshold = threshold
+ self._level = level
+ self._integration = integration
+ self._func = GetSteerableViewMapDensityF1D(self._level, self._integration)
+ def __call__(self, inter):
+ v = self._func(inter)
+ if v > self._threshold:
+ return 1
+ return 0
+
+class pyHighDirectionalViewMapDensityUP1D(UnaryPredicate1D):
+ def __init__(self,threshold, orientation, level,integration = IntegrationType.MEAN, sampling=2.0):
+ UnaryPredicate1D.__init__(self)
+ self._threshold = threshold
+ self._orientation = orientation
+ self._level = level
+ self._integration = integration
+ self._sampling = sampling
+ def __call__(self, inter):
+ func = GetDirectionalViewMapDensityF1D(self._orientation, self._level, self._integration, self._sampling)
+ v = func(inter)
+ if v > self._threshold:
+ return 1
+ return 0
+
+class pyHighViewMapDensityUP1D(UnaryPredicate1D):
+ def __init__(self,threshold, level,integration = IntegrationType.MEAN, sampling=2.0):
+ UnaryPredicate1D.__init__(self)
+ self._threshold = threshold
+ self._level = level
+ self._integration = integration
+ self._sampling = sampling
+        self._func = GetCompleteViewMapDensityF1D(self._level, self._integration, self._sampling)  # sampling defaults to 2.0
+ def __call__(self, inter):
+ #print("toto")
+ #print(func.name)
+ #print(inter.name)
+ v= self._func(inter)
+ if v > self._threshold:
+ return 1
+ return 0
+
+class pyDensityFunctorUP1D(UnaryPredicate1D):
+ def __init__(self,wsize,threshold, functor, funcmin=0.0, funcmax=1.0, integration = IntegrationType.MEAN):
+ UnaryPredicate1D.__init__(self)
+ self._wsize = wsize
+ self._threshold = float(threshold)
+ self._functor = functor
+ self._funcmin = float(funcmin)
+ self._funcmax = float(funcmax)
+ self._integration = integration
+ def __call__(self, inter):
+ func = DensityF1D(self._wsize, self._integration)
+ res = self._functor(inter)
+ k = (res-self._funcmin)/(self._funcmax-self._funcmin)
+ if func(inter) < self._threshold*k:
+ return 1
+ return 0
+
+class pyZSmallerUP1D(UnaryPredicate1D):
+ def __init__(self,z, integration=IntegrationType.MEAN):
+ UnaryPredicate1D.__init__(self)
+ self._z = z
+ self._integration = integration
+ def __call__(self, inter):
+ func = GetProjectedZF1D(self._integration)
+ if func(inter) < self._z:
+ return 1
+ return 0
+
+class pyIsOccludedByUP1D(UnaryPredicate1D):
+ def __init__(self,id):
+ UnaryPredicate1D.__init__(self)
+ self._id = id
+ def __call__(self, inter):
+ func = GetShapeF1D()
+ shapes = func(inter)
+ for s in shapes:
+ if(s.id == self._id):
+ return 0
+ it = inter.vertices_begin()
+ itlast = inter.vertices_end()
+ itlast.decrement()
+ v = it.object
+ vlast = itlast.object
+ tvertex = v.viewvertex
+ if type(tvertex) is TVertex:
+ #print("TVertex: [ ", tvertex.id.first, ",", tvertex.id.second," ]")
+ eit = tvertex.edges_begin()
+ while not eit.is_end:
+ ve, incoming = eit.object
+ if ve.id == self._id:
+ return 1
+ #print("-------", ve.id.first, "-", ve.id.second)
+ eit.increment()
+ tvertex = vlast.viewvertex
+ if type(tvertex) is TVertex:
+ #print("TVertex: [ ", tvertex.id.first, ",", tvertex.id.second," ]")
+ eit = tvertex.edges_begin()
+ while not eit.is_end:
+ ve, incoming = eit.object
+ if ve.id == self._id:
+ return 1
+ #print("-------", ve.id.first, "-", ve.id.second)
+ eit.increment()
+ return 0
+
+class pyIsInOccludersListUP1D(UnaryPredicate1D):
+ def __init__(self,id):
+ UnaryPredicate1D.__init__(self)
+ self._id = id
+ def __call__(self, inter):
+ func = GetOccludersF1D()
+ occluders = func(inter)
+ for a in occluders:
+ if a.id == self._id:
+ return 1
+ return 0
+
+class pyIsOccludedByItselfUP1D(UnaryPredicate1D):
+ def __init__(self):
+ UnaryPredicate1D.__init__(self)
+ self.__func1 = GetOccludersF1D()
+ self.__func2 = GetShapeF1D()
+ def __call__(self, inter):
+ lst1 = self.__func1(inter)
+ lst2 = self.__func2(inter)
+ for vs1 in lst1:
+ for vs2 in lst2:
+ if vs1.id == vs2.id:
+ return 1
+ return 0
+
+class pyIsOccludedByIdListUP1D(UnaryPredicate1D):
+ def __init__(self, idlist):
+ UnaryPredicate1D.__init__(self)
+ self._idlist = idlist
+ self.__func1 = GetOccludersF1D()
+ def __call__(self, inter):
+ lst1 = self.__func1(inter)
+ for vs1 in lst1:
+ for _id in self._idlist:
+ if vs1.id == _id:
+ return 1
+ return 0
+
+class pyShapeIdListUP1D(UnaryPredicate1D):
+ def __init__(self,idlist):
+ UnaryPredicate1D.__init__(self)
+ self._idlist = idlist
+ self._funcs = []
+ for _id in idlist :
+ self._funcs.append(ShapeUP1D(_id.first, _id.second))
+ def __call__(self, inter):
+ for func in self._funcs :
+ if func(inter) == 1:
+ return 1
+ return 0
+
+## deprecated
+class pyShapeIdUP1D(UnaryPredicate1D):
+ def __init__(self, _id):
+ UnaryPredicate1D.__init__(self)
+ self._id = _id
+ def __call__(self, inter):
+ func = GetShapeF1D()
+ shapes = func(inter)
+ for a in shapes:
+ if a.id == self._id:
+ return 1
+ return 0
+
+class pyHighDensityAnisotropyUP1D(UnaryPredicate1D):
+ def __init__(self,threshold, level, sampling=2.0):
+ UnaryPredicate1D.__init__(self)
+ self._l = threshold
+ self.func = pyDensityAnisotropyF1D(level, IntegrationType.MEAN, sampling)
+ def __call__(self, inter):
+ return (self.func(inter) > self._l)
+
+class pyHighViewMapGradientNormUP1D(UnaryPredicate1D):
+ def __init__(self,threshold, l, sampling=2.0):
+ UnaryPredicate1D.__init__(self)
+ self._threshold = threshold
+        self._GetGradient = pyViewMapGradientNormF1D(l, IntegrationType.MEAN, sampling)
+ def __call__(self, inter):
+ gn = self._GetGradient(inter)
+ #print(gn)
+ return (gn > self._threshold)
+
+class pyDensityVariableSigmaUP1D(UnaryPredicate1D):
+ def __init__(self,functor, sigmaMin,sigmaMax, lmin, lmax, tmin, tmax, integration = IntegrationType.MEAN, sampling=2.0):
+ UnaryPredicate1D.__init__(self)
+ self._functor = functor
+ self._sigmaMin = float(sigmaMin)
+ self._sigmaMax = float(sigmaMax)
+ self._lmin = float(lmin)
+ self._lmax = float(lmax)
+ self._tmin = tmin
+ self._tmax = tmax
+ self._integration = integration
+ self._sampling = sampling
+ def __call__(self, inter):
+ sigma = (self._sigmaMax-self._sigmaMin)/(self._lmax-self._lmin)*(self._functor(inter)-self._lmin) + self._sigmaMin
+ t = (self._tmax-self._tmin)/(self._lmax-self._lmin)*(self._functor(inter)-self._lmin) + self._tmin
+ if sigma < self._sigmaMin:
+ sigma = self._sigmaMin
+ self._func = DensityF1D(sigma, self._integration, self._sampling)
+ d = self._func(inter)
+ if d < t:
+ return 1
+ return 0
+
+class pyClosedCurveUP1D(UnaryPredicate1D):
+ def __call__(self, inter):
+ it = inter.vertices_begin()
+ itlast = inter.vertices_end()
+ itlast.decrement()
+ vlast = itlast.object
+ v = it.object
+ #print(v.id.first, v.id.second)
+ #print(vlast.id.first, vlast.id.second)
+ if v.id == vlast.id:
+ return 1
+ return 0
+
+## Binary predicates for 1D elements (curves)
+#############################################
+
+class pyZBP1D(BinaryPredicate1D):
+ def __call__(self, i1, i2):
+ func = GetZF1D()
+ return (func(i1) > func(i2))
+
+class pyZDiscontinuityBP1D(BinaryPredicate1D):
+ def __init__(self, iType = IntegrationType.MEAN):
+ BinaryPredicate1D.__init__(self)
+ self._GetZDiscontinuity = ZDiscontinuityF1D(iType)
+ def __call__(self, i1, i2):
+ return (self._GetZDiscontinuity(i1) > self._GetZDiscontinuity(i2))
+
+class pyLengthBP1D(BinaryPredicate1D):
+ def __call__(self, i1, i2):
+ return (i1.length_2d > i2.length_2d)
+
+class pySilhouetteFirstBP1D(BinaryPredicate1D):
+ def __call__(self, inter1, inter2):
+ bpred = SameShapeIdBP1D()
+ if (bpred(inter1, inter2) != 1):
+ return 0
+ if (inter1.nature & Nature.SILHOUETTE):
+ return (inter2.nature & Nature.SILHOUETTE) != 0
+ return (inter1.nature == inter2.nature)
+
+class pyNatureBP1D(BinaryPredicate1D):
+ def __call__(self, inter1, inter2):
+ return (inter1.nature & inter2.nature)
+
+class pyViewMapGradientNormBP1D(BinaryPredicate1D):
+ def __init__(self,l, sampling=2.0):
+ BinaryPredicate1D.__init__(self)
+        self._GetGradient = pyViewMapGradientNormF1D(l, IntegrationType.MEAN, sampling)
+ def __call__(self, i1,i2):
+ #print("compare gradient")
+ return (self._GetGradient(i1) > self._GetGradient(i2))
+
+class pyShuffleBP1D(BinaryPredicate1D):
+ def __init__(self):
+ BinaryPredicate1D.__init__(self)
+ random.seed(1)
+ def __call__(self, inter1, inter2):
+ r1 = random.uniform(0,1)
+ r2 = random.uniform(0,1)
+ return (r1<r2)
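AndUP1D, OrUP1D and NotUP1D above combine predicates purely through `__call__`, so arbitrary predicate expressions can be built by nesting. A minimal sketch of the same combinator pattern without the Freestyle base classes (the `LongerThan` predicate and the dict standing in for an Interface1D are hypothetical):

```python
# Stand-ins for AndUP1D/NotUP1D: each wraps callables and combines
# their boolean results, exactly as the classes above do.
class And:
    def __init__(self, p1, p2):
        self.p1, self.p2 = p1, p2
    def __call__(self, x):
        return self.p1(x) and self.p2(x)

class Not:
    def __init__(self, p):
        self.p = p
    def __call__(self, x):
        return not self.p(x)

class LongerThan:
    # Hypothetical analogue of pyHigherLengthUP1D: true when the
    # object's 2D length exceeds the threshold.
    def __init__(self, l):
        self.l = l
    def __call__(self, x):
        return x["length_2d"] > self.l

# Select strokes whose length is in (5, 20]:
pred = And(LongerThan(5.0), Not(LongerThan(20.0)))
```

The same nesting works with the real classes, e.g. `AndUP1D(QuantitativeInvisibilityUP1D(0), pyHigherLengthUP1D(5.0))`.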
diff --git a/release/scripts/freestyle/modules/freestyle/shaders.py b/release/scripts/freestyle/modules/freestyle/shaders.py
new file mode 100644
index 00000000000..6b1609be709
--- /dev/null
+++ b/release/scripts/freestyle/modules/freestyle/shaders.py
@@ -0,0 +1,1249 @@
+# ##### BEGIN GPL LICENSE BLOCK #####
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
+#
+# ##### END GPL LICENSE BLOCK #####
+
+# Filename : shaders.py
+# Authors : Fredo Durand, Stephane Grabli, Francois Sillion, Emmanuel Turquin
+# Date : 11/08/2005
+# Purpose : Stroke shaders to be used for creation of stylized strokes
+
+# module members
+from _freestyle import (
+ BackboneStretcherShader,
+ BezierCurveShader,
+ CalligraphicShader,
+ ColorNoiseShader,
+ ColorVariationPatternShader,
+ ConstantColorShader,
+ ConstantThicknessShader,
+ ConstrainedIncreasingThicknessShader,
+ GuidingLinesShader,
+ IncreasingColorShader,
+ IncreasingThicknessShader,
+ PolygonalizationShader,
+ SamplingShader,
+ SmoothingShader,
+ SpatialNoiseShader,
+ StrokeTextureShader,
+ TextureAssignerShader,
+ ThicknessNoiseShader,
+ ThicknessVariationPatternShader,
+ TipRemoverShader,
+ fstreamShader,
+ streamShader,
+ )
+
+# modules for implementing shaders
+from freestyle.types import Interface0DIterator, Nature, StrokeShader, StrokeVertexIterator
+from freestyle.functions import DensityF0D, GetProjectedZF0D, MaterialF0D, ZDiscontinuityF0D
+from freestyle.predicates import pyVertexNatureUP0D
+from freestyle.utils import ContextFunctions as CF
+import math
+import mathutils
+import random
+
+## thickness modifiers
+######################
+
+class pyDepthDiscontinuityThicknessShader(StrokeShader):
+ def __init__(self, min, max):
+ StrokeShader.__init__(self)
+ self.__min = float(min)
+ self.__max = float(max)
+ self.__func = ZDiscontinuityF0D()
+ def shade(self, stroke):
+ z_min=0.0
+ z_max=1.0
+ a = (self.__max - self.__min)/(z_max-z_min)
+ b = (self.__min*z_max-self.__max*z_min)/(z_max-z_min)
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ z = self.__func(Interface0DIterator(it))
+ thickness = a*z+b
+ it.object.attribute.thickness = (thickness, thickness)
+ it.increment()
+
+class pyConstantThicknessShader(StrokeShader):
+ def __init__(self, thickness):
+ StrokeShader.__init__(self)
+ self._thickness = thickness
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ t = self._thickness/2.0
+ it.object.attribute.thickness = (t, t)
+ it.increment()
+
+class pyFXSVaryingThicknessWithDensityShader(StrokeShader):
+ def __init__(self, wsize, threshold_min, threshold_max, thicknessMin, thicknessMax):
+ StrokeShader.__init__(self)
+ self.wsize= wsize
+ self.threshold_min= threshold_min
+ self.threshold_max= threshold_max
+ self._thicknessMin = thicknessMin
+ self._thicknessMax = thicknessMax
+ def shade(self, stroke):
+ n = stroke.stroke_vertices_size()
+ i = 0
+ it = stroke.stroke_vertices_begin()
+ func = DensityF0D(self.wsize)
+ while not it.is_end:
+ c = func(Interface0DIterator(it))
+ if c < self.threshold_min:
+ c = self.threshold_min
+ if c > self.threshold_max:
+ c = self.threshold_max
+## t = (c - self.threshold_min)/(self.threshold_max - self.threshold_min)*(self._thicknessMax-self._thicknessMin) + self._thicknessMin
+ t = (self.threshold_max - c )/(self.threshold_max - self.threshold_min)*(self._thicknessMax-self._thicknessMin) + self._thicknessMin
+ it.object.attribute.thickness = (t/2.0, t/2.0)
+ i = i+1
+ it.increment()
+
+class pyIncreasingThicknessShader(StrokeShader):
+ def __init__(self, thicknessMin, thicknessMax):
+ StrokeShader.__init__(self)
+ self._thicknessMin = thicknessMin
+ self._thicknessMax = thicknessMax
+ def shade(self, stroke):
+ n = stroke.stroke_vertices_size()
+ i = 0
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ c = float(i)/float(n)
+ if i < float(n)/2.0:
+ t = (1.0 - c)*self._thicknessMin + c * self._thicknessMax
+ else:
+ t = (1.0 - c)*self._thicknessMax + c * self._thicknessMin
+ it.object.attribute.thickness = (t/2.0, t/2.0)
+ i = i+1
+ it.increment()
+
+class pyConstrainedIncreasingThicknessShader(StrokeShader):
+ def __init__(self, thicknessMin, thicknessMax, ratio):
+ StrokeShader.__init__(self)
+ self._thicknessMin = thicknessMin
+ self._thicknessMax = thicknessMax
+ self._ratio = ratio
+ def shade(self, stroke):
+ slength = stroke.length_2d
+ tmp = self._ratio*slength
+ maxT = 0.0
+ if tmp < self._thicknessMax:
+ maxT = tmp
+ else:
+ maxT = self._thicknessMax
+ n = stroke.stroke_vertices_size()
+ i = 0
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ att = it.object.attribute
+ c = float(i)/float(n)
+ if i < float(n)/2.0:
+ t = (1.0 - c)*self._thicknessMin + c * maxT
+ else:
+ t = (1.0 - c)*maxT + c * self._thicknessMin
+ att.thickness = (t/2.0, t/2.0)
+ if i == n-1:
+ att.thickness = (self._thicknessMin/2.0, self._thicknessMin/2.0)
+ i = i+1
+ it.increment()
+
+class pyDecreasingThicknessShader(StrokeShader):
+ def __init__(self, thicknessMin, thicknessMax):
+ StrokeShader.__init__(self)
+ self._thicknessMin = thicknessMin
+ self._thicknessMax = thicknessMax
+ def shade(self, stroke):
+ l = stroke.length_2d
+ tMax = self._thicknessMax
+ if self._thicknessMax > 0.33*l:
+ tMax = 0.33*l
+ tMin = self._thicknessMin
+ if self._thicknessMin > 0.1*l:
+ tMin = 0.1*l
+ n = stroke.stroke_vertices_size()
+ i = 0
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ c = float(i)/float(n)
+ t = (1.0 - c)*tMax +c*tMin
+ it.object.attribute.thickness = (t/2.0, t/2.0)
+ i = i+1
+ it.increment()
+
+class pyNonLinearVaryingThicknessShader(StrokeShader):
+ def __init__(self, thicknessExtremity, thicknessMiddle, exponent):
+ StrokeShader.__init__(self)
+ self._thicknessMin = thicknessMiddle
+ self._thicknessMax = thicknessExtremity
+ self._exponent = exponent
+ def shade(self, stroke):
+ n = stroke.stroke_vertices_size()
+ i = 0
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ if i < float(n)/2.0:
+ c = float(i)/float(n)
+ else:
+ c = float(n-i)/float(n)
+ c = self.smoothC(c, self._exponent)
+ t = (1.0 - c)*self._thicknessMax + c * self._thicknessMin
+ it.object.attribute.thickness = (t/2.0, t/2.0)
+ i = i+1
+ it.increment()
+ def smoothC(self, a, exp):
+ return math.pow(float(a), exp) * math.pow(2.0, exp)
+
+## Spherical linear interpolation (cos)
+class pySLERPThicknessShader(StrokeShader):
+ def __init__(self, thicknessMin, thicknessMax, omega=1.2):
+ StrokeShader.__init__(self)
+ self._thicknessMin = thicknessMin
+ self._thicknessMax = thicknessMax
+ self._omega = omega
+ def shade(self, stroke):
+ slength = stroke.length_2d
+ tmp = 0.33*slength
+ maxT = self._thicknessMax
+ if tmp < self._thicknessMax:
+ maxT = tmp
+ n = stroke.stroke_vertices_size()
+ i = 0
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ c = float(i)/float(n)
+ if i < float(n)/2.0:
+ t = math.sin((1-c)*self._omega)/math.sinh(self._omega)*self._thicknessMin + math.sin(c*self._omega)/math.sinh(self._omega) * maxT
+ else:
+ t = math.sin((1-c)*self._omega)/math.sinh(self._omega)*maxT + math.sin(c*self._omega)/math.sinh(self._omega) * self._thicknessMin
+ it.object.attribute.thickness = (t/2.0, t/2.0)
+ i = i+1
+ it.increment()
+
+class pyTVertexThickenerShader(StrokeShader): ## FIXME
+ def __init__(self, a=1.5, n=3):
+ StrokeShader.__init__(self)
+ self._a = a
+ self._n = n
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ predTVertex = pyVertexNatureUP0D(Nature.T_VERTEX)
+ while not it.is_end:
+ if predTVertex(it) == 1:
+ it2 = StrokeVertexIterator(it)
+ it2.increment()
+ if not (it.is_begin or it2.is_end):
+ it.increment()
+ continue
+ n = self._n
+ a = self._a
+ if it.is_begin:
+ it3 = StrokeVertexIterator(it)
+ count = 0
+ while (not it3.is_end) and count < n:
+ att = it3.object.attribute
+ (tr, tl) = att.thickness
+ r = (a-1.0)/float(n-1)*(float(n)/float(count+1) - 1) + 1
+ #r = (1.0-a)/float(n-1)*count + a
+ att.thickness = (r*tr, r*tl)
+ it3.increment()
+ count = count + 1
+ if it2.is_end:
+ it4 = StrokeVertexIterator(it)
+ count = 0
+ while (not it4.is_begin) and count < n:
+ att = it4.object.attribute
+ (tr, tl) = att.thickness
+ r = (a-1.0)/float(n-1)*(float(n)/float(count+1) - 1) + 1
+ #r = (1.0-a)/float(n-1)*count + a
+ att.thickness = (r*tr, r*tl)
+ it4.decrement()
+ count = count + 1
+ if it4.is_begin:
+ att = it4.object.attribute
+ (tr, tl) = att.thickness
+ r = (a-1.0)/float(n-1)*(float(n)/float(count+1) - 1) + 1
+ #r = (1.0-a)/float(n-1)*count + a
+ att.thickness = (r*tr, r*tl)
+ it.increment()
+
+class pyImportance2DThicknessShader(StrokeShader):
+ def __init__(self, x, y, w, kmin, kmax):
+ StrokeShader.__init__(self)
+ self._x = x
+ self._y = y
+ self._w = float(w)
+ self._kmin = float(kmin)
+ self._kmax = float(kmax)
+ def shade(self, stroke):
+ origin = mathutils.Vector([self._x, self._y])
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ v = it.object
+ p = mathutils.Vector([v.projected_x, v.projected_y])
+ d = (p-origin).length
+ if d > self._w:
+ k = self._kmin
+ else:
+ k = (self._kmax*(self._w-d) + self._kmin*d)/self._w
+ att = v.attribute
+ (tr, tl) = att.thickness
+ att.thickness = (k*tr/2.0, k*tl/2.0)
+ it.increment()
+
+class pyImportance3DThicknessShader(StrokeShader):
+ def __init__(self, x, y, z, w, kmin, kmax):
+ StrokeShader.__init__(self)
+ self._x = x
+ self._y = y
+ self._z = z
+ self._w = float(w)
+ self._kmin = float(kmin)
+ self._kmax = float(kmax)
+ def shade(self, stroke):
+ origin = mathutils.Vector([self._x, self._y, self._z])
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ v = it.object
+ p = v.point_3d
+ d = (p-origin).length
+ if d > self._w:
+ k = self._kmin
+ else:
+ k = (self._kmax*(self._w-d) + self._kmin*d)/self._w
+ att = v.attribute
+ (tr, tl) = att.thickness
+ att.thickness = (k*tr/2.0, k*tl/2.0)
+ it.increment()
+
+class pyZDependingThicknessShader(StrokeShader):
+ def __init__(self, min, max):
+ StrokeShader.__init__(self)
+ self.__min = min
+ self.__max = max
+ self.__func = GetProjectedZF0D()
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ z_min = 1
+ z_max = 0
+ while not it.is_end:
+ z = self.__func(Interface0DIterator(it))
+ if z < z_min:
+ z_min = z
+ if z > z_max:
+ z_max = z
+ it.increment()
+ z_diff = 1 / (z_max - z_min)
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ z = (self.__func(Interface0DIterator(it)) - z_min) * z_diff
+ thickness = (1 - z) * self.__max + z * self.__min
+ it.object.attribute.thickness = (thickness, thickness)
+ it.increment()
+
+
+## color modifiers
+##################
+
+class pyConstantColorShader(StrokeShader):
+ def __init__(self,r,g,b, a = 1):
+ StrokeShader.__init__(self)
+ self._r = r
+ self._g = g
+ self._b = b
+ self._a = a
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ att = it.object.attribute
+ att.color = (self._r, self._g, self._b)
+ att.alpha = self._a
+ it.increment()
+
+#c1->c2
+class pyIncreasingColorShader(StrokeShader):
+ def __init__(self,r1,g1,b1,a1, r2,g2,b2,a2):
+ StrokeShader.__init__(self)
+ self._c1 = [r1,g1,b1,a1]
+ self._c2 = [r2,g2,b2,a2]
+ def shade(self, stroke):
+ n = stroke.stroke_vertices_size() - 1
+ inc = 0
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ att = it.object.attribute
+ c = float(inc)/float(n)
+
+ att.color = ((1-c)*self._c1[0] + c*self._c2[0],
+ (1-c)*self._c1[1] + c*self._c2[1],
+ (1-c)*self._c1[2] + c*self._c2[2])
+ att.alpha = (1-c)*self._c1[3] + c*self._c2[3]
+ inc = inc+1
+ it.increment()
+
+# c1->c2->c1
+class pyInterpolateColorShader(StrokeShader):
+ def __init__(self,r1,g1,b1,a1, r2,g2,b2,a2):
+ StrokeShader.__init__(self)
+ self._c1 = [r1,g1,b1,a1]
+ self._c2 = [r2,g2,b2,a2]
+ def shade(self, stroke):
+ n = stroke.stroke_vertices_size() - 1
+ inc = 0
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ att = it.object.attribute
+ u = float(inc)/float(n)
+ c = 1-2*(math.fabs(u-0.5))
+ att.color = ((1-c)*self._c1[0] + c*self._c2[0],
+ (1-c)*self._c1[1] + c*self._c2[1],
+ (1-c)*self._c1[2] + c*self._c2[2])
+ att.alpha = (1-c)*self._c1[3] + c*self._c2[3]
+ inc = inc+1
+ it.increment()
+
+class pyMaterialColorShader(StrokeShader):
+ def __init__(self, threshold=50):
+ StrokeShader.__init__(self)
+ self._threshold = threshold
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ func = MaterialF0D()
+ xn = 0.312713
+ yn = 0.329016
+ Yn = 1.0
+ un = 4.* xn/ ( -2.*xn + 12.*yn + 3. )
+ vn= 9.* yn/ ( -2.*xn + 12.*yn +3. )
+ while not it.is_end:
+ mat = func(Interface0DIterator(it))
+
+ r = mat.diffuse[0]
+ g = mat.diffuse[1]
+ b = mat.diffuse[2]
+
+ X = 0.412453*r + 0.35758 *g + 0.180423*b
+ Y = 0.212671*r + 0.71516 *g + 0.072169*b
+ Z = 0.019334*r + 0.119193*g + 0.950227*b
+
+ if X == 0 and Y == 0 and Z == 0:
+ X = 0.01
+ Y = 0.01
+ Z = 0.01
+ u = 4.*X / (X + 15.*Y + 3.*Z)
+ v = 9.*Y / (X + 15.*Y + 3.*Z)
+
+ L= 116. * math.pow((Y/Yn),(1./3.)) -16
+ U = 13. * L * (u - un)
+ V = 13. * L * (v - vn)
+
+ if L > self._threshold:
+ L = L/1.3
+ U = U+10
+ else:
+ L = L +2.5*(100-L)/5.
+ U = U/3.0
+ V = V/3.0
+ u = U / (13. * L) + un
+ v = V / (13. * L) + vn
+
+ Y = Yn * math.pow( ((L+16.)/116.), 3.)
+ X = -9. * Y * u / ((u - 4.)* v - u * v)
+ Z = (9. * Y - 15*v*Y - v*X) /( 3. * v)
+
+ r = 3.240479 * X - 1.53715 * Y - 0.498535 * Z
+ g = -0.969256 * X + 1.875991 * Y + 0.041556 * Z
+ b = 0.055648 * X - 0.204043 * Y + 1.057311 * Z
+
+ r = max(0,r)
+ g = max(0,g)
+ b = max(0,b)
+
+ it.object.attribute.color = (r, g, b)
+ it.increment()
+
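The Luv manipulation in pyMaterialColorShader above is easier to check in isolation. The sketch below extracts the same RGB to XYZ to CIE Luv conversion as a plain function with no Freestyle dependency, using the shader's own matrix and white-point constants, the same division-by-zero guard, and the same simplified L formula (no CIE linear segment for dark colors). The function name is mine for illustration, not part of the Freestyle API.

```python
# RGB -> CIE Luv, as computed inside pyMaterialColorShader (illustrative
# extraction, not a Freestyle API function).  Uses the shader's constants:
# white point (xn, yn) = (0.312713, 0.329016) and Yn = 1.0.
def rgb_to_luv(r, g, b):
    xn, yn = 0.312713, 0.329016
    un = 4 * xn / (-2 * xn + 12 * yn + 3)
    vn = 9 * yn / (-2 * xn + 12 * yn + 3)
    X = 0.412453 * r + 0.35758 * g + 0.180423 * b
    Y = 0.212671 * r + 0.71516 * g + 0.072169 * b
    Z = 0.019334 * r + 0.119193 * g + 0.950227 * b
    if X == 0 and Y == 0 and Z == 0:
        X = Y = Z = 0.01              # same guard against division by zero
    u = 4 * X / (X + 15 * Y + 3 * Z)
    v = 9 * Y / (X + 15 * Y + 3 * Z)
    L = 116 * Y ** (1 / 3) - 16       # Yn = 1.0; linear segment omitted
    return L, 13 * L * (u - un), 13 * L * (v - vn)
```

For white input (1, 1, 1) this yields L close to 100 with U and V near zero, confirming the matrix and white-point constants are mutually consistent.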
+class pyRandomColorShader(StrokeShader):
+ def __init__(self, s=1):
+ StrokeShader.__init__(self)
+ random.seed(s)
+ def shade(self, stroke):
+ ## pick a random color
+ c0 = float(random.uniform(15,75))/100.0
+ c1 = float(random.uniform(15,75))/100.0
+ c2 = float(random.uniform(15,75))/100.0
+ #print(c0, c1, c2)
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ it.object.attribute.color = (c0,c1,c2)
+ it.increment()
+
+class py2DCurvatureColorShader(StrokeShader):
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ func = Curvature2DAngleF0D()
+ while not it.is_end:
+ c = func(Interface0DIterator(it))
+ if c < 0:
+ print("negative 2D curvature")
+ color = 10.0 * c/3.1415
+ it.object.attribute.color = (color, color, color)
+ it.increment()
+
+class pyTimeColorShader(StrokeShader):
+ def __init__(self, step=0.01):
+ StrokeShader.__init__(self)
+ self._t = 0
+ self._step = step
+ def shade(self, stroke):
+ c = self._t*1.0
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ it.object.attribute.color = (c,c,c)
+ it.increment()
+ self._t = self._t+self._step
+
+## geometry modifiers
+
+class pySamplingShader(StrokeShader):
+ def __init__(self, sampling):
+ StrokeShader.__init__(self)
+ self._sampling = sampling
+ def shade(self, stroke):
+ stroke.resample(float(self._sampling))
+ stroke.update_length()
+
+class pyBackboneStretcherShader(StrokeShader):
+ def __init__(self, l):
+ StrokeShader.__init__(self)
+ self._l = l
+ def shade(self, stroke):
+ it0 = stroke.stroke_vertices_begin()
+ it1 = StrokeVertexIterator(it0)
+ it1.increment()
+ itn = stroke.stroke_vertices_end()
+ itn.decrement()
+ itn_1 = StrokeVertexIterator(itn)
+ itn_1.decrement()
+ v0 = it0.object
+ v1 = it1.object
+ vn_1 = itn_1.object
+ vn = itn.object
+ p0 = mathutils.Vector([v0.projected_x, v0.projected_y])
+ pn = mathutils.Vector([vn.projected_x, vn.projected_y])
+ p1 = mathutils.Vector([v1.projected_x, v1.projected_y])
+ pn_1 = mathutils.Vector([vn_1.projected_x, vn_1.projected_y])
+ d1 = p0-p1
+ d1.normalize()
+ dn = pn-pn_1
+ dn.normalize()
+ newFirst = p0+d1*float(self._l)
+ newLast = pn+dn*float(self._l)
+ v0.point = newFirst
+ vn.point = newLast
+ stroke.update_length()
+
+class pyLengthDependingBackboneStretcherShader(StrokeShader):
+ def __init__(self, l):
+ StrokeShader.__init__(self)
+ self._l = l
+ def shade(self, stroke):
+ l = stroke.length_2d
+ stretch = self._l*l
+ it0 = stroke.stroke_vertices_begin()
+ it1 = StrokeVertexIterator(it0)
+ it1.increment()
+ itn = stroke.stroke_vertices_end()
+ itn.decrement()
+ itn_1 = StrokeVertexIterator(itn)
+ itn_1.decrement()
+ v0 = it0.object
+ v1 = it1.object
+ vn_1 = itn_1.object
+ vn = itn.object
+ p0 = mathutils.Vector([v0.projected_x, v0.projected_y])
+ pn = mathutils.Vector([vn.projected_x, vn.projected_y])
+ p1 = mathutils.Vector([v1.projected_x, v1.projected_y])
+ pn_1 = mathutils.Vector([vn_1.projected_x, vn_1.projected_y])
+ d1 = p0-p1
+ d1.normalize()
+ dn = pn-pn_1
+ dn.normalize()
+ newFirst = p0+d1*float(stretch)
+ newLast = pn+dn*float(stretch)
+ v0.point = newFirst
+ vn.point = newLast
+ stroke.update_length()
+
+
+## Shader to replace a stroke by its corresponding tangent
+class pyGuidingLineShader(StrokeShader):
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin() ## get the first vertex
+ itlast = stroke.stroke_vertices_end() ##
+ itlast.decrement() ## get the last one
+ t = itlast.object.point - it.object.point ## tangent direction
+ itmiddle = StrokeVertexIterator(it) ##
+ while itmiddle.object.u < 0.5: ## look for the stroke middle vertex
+ itmiddle.increment() ##
+ it = StrokeVertexIterator(itmiddle)
+ it.increment()
+ while not it.is_end: ## position all the vertices along the tangent for the right part
+ it.object.point = itmiddle.object.point+t*(it.object.u-itmiddle.object.u)
+ it.increment()
+ it = StrokeVertexIterator(itmiddle)
+ it.decrement()
+ while not it.is_begin: ## position all the vertices along the tangent for the left part
+ it.object.point = itmiddle.object.point-t*(itmiddle.object.u-it.object.u)
+ it.decrement()
+ it.object.point = itmiddle.object.point-t*itmiddle.object.u ## first vertex
+ stroke.update_length()
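The comments in pyGuidingLineShader describe the whole trick: take the chord from the first to the last vertex as the tangent, anchor at the vertex nearest u = 0.5, and place every vertex on that line according to its curvilinear coordinate u. A minimal standalone sketch of that idea follows; plain (x, y) tuples stand in for Freestyle's StrokeVertex, and `straighten` is a hypothetical helper, not part of the API.

```python
# Standalone sketch of the guiding-line idea: re-position every vertex on
# the stroke's chord (first -> last vertex), parameterized by its
# curvilinear coordinate u in [0, 1].  Illustration only.
def straighten(points, us):
    (x0, y0), (xn, yn) = points[0], points[-1]
    tx, ty = xn - x0, yn - y0          # tangent (chord) direction
    # anchor at the vertex closest to u = 0.5, like the shader's itmiddle
    mid = min(range(len(us)), key=lambda i: abs(us[i] - 0.5))
    mx, my = points[mid]
    um = us[mid]
    return [(mx + tx * (u - um), my + ty * (u - um)) for u in us]
```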
+
+
+class pyBackboneStretcherNoCuspShader(StrokeShader):
+ def __init__(self, l):
+ StrokeShader.__init__(self)
+ self._l = l
+ def shade(self, stroke):
+ it0 = stroke.stroke_vertices_begin()
+ it1 = StrokeVertexIterator(it0)
+ it1.increment()
+ itn = stroke.stroke_vertices_end()
+ itn.decrement()
+ itn_1 = StrokeVertexIterator(itn)
+ itn_1.decrement()
+ v0 = it0.object
+ v1 = it1.object
+ if (v0.nature & Nature.CUSP) == 0 and (v1.nature & Nature.CUSP) == 0:
+ p0 = v0.point
+ p1 = v1.point
+ d1 = p0-p1
+ d1.normalize()
+ newFirst = p0+d1*float(self._l)
+ v0.point = newFirst
+ vn_1 = itn_1.object
+ vn = itn.object
+ if (vn.nature & Nature.CUSP) == 0 and (vn_1.nature & Nature.CUSP) == 0:
+ pn = vn.point
+ pn_1 = vn_1.point
+ dn = pn-pn_1
+ dn.normalize()
+ newLast = pn+dn*float(self._l)
+ vn.point = newLast
+ stroke.update_length()
+
+class pyDiffusion2Shader(StrokeShader):
+ """This shader iteratively adds an offset to the position of each
+ stroke vertex in the direction perpendicular to the stroke direction
+ at the point. The offset is scaled by the 2D curvature (i.e., how
+ sharply the stroke curves) at the point."""
+ def __init__(self, lambda1, nbIter):
+ StrokeShader.__init__(self)
+ self._lambda = lambda1
+ self._nbIter = nbIter
+ self._normalInfo = Normal2DF0D()
+ self._curvatureInfo = Curvature2DAngleF0D()
+ def shade(self, stroke):
+ for i in range (1, self._nbIter):
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ v = it.object
+ p1 = v.point
+ p2 = self._normalInfo(Interface0DIterator(it))*self._lambda*self._curvatureInfo(Interface0DIterator(it))
+ v.point = p1+p2
+ it.increment()
+ stroke.update_length()
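As a rough standalone analogue of the rule in pyDiffusion2Shader's docstring: on a plain polyline, the discrete Laplacian (midpoint of the two neighbours minus the vertex) points along the local normal with a magnitude proportional to curvature, so repeated Laplacian steps reproduce the same curvature-driven diffusion without the Freestyle Normal2DF0D/Curvature2DAngleF0D functors. The helper below is illustrative, not part of the API.

```python
# Curvature-driven diffusion on a polyline: each interior vertex moves
# toward the midpoint of its neighbours by a factor lam per iteration;
# endpoints stay fixed.  Plain (x, y) tuples stand in for stroke vertices.
def diffuse(points, lam=0.5, n_iter=10):
    pts = [list(p) for p in points]
    for _ in range(n_iter):
        pts = [pts[0]] + [
            [p[0] + lam * ((a[0] + b[0]) / 2 - p[0]),
             p[1] + lam * ((a[1] + b[1]) / 2 - p[1])]
            for a, p, b in zip(pts, pts[1:], pts[2:])
        ] + [pts[-1]]
    return [tuple(p) for p in pts]
```

With lam = 1.0 a single step snaps each interior vertex exactly onto the midpoint of its neighbours, flattening a sharp corner in one pass.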
+
+class pyTipRemoverShader(StrokeShader):
+ def __init__(self, l):
+ StrokeShader.__init__(self)
+ self._l = l
+ def shade(self, stroke):
+ originalSize = stroke.stroke_vertices_size()
+ if originalSize < 4:
+ return
+ verticesToRemove = []
+ oldAttributes = []
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ v = it.object
+ if v.curvilinear_abscissa < self._l or v.stroke_length-v.curvilinear_abscissa < self._l:
+ verticesToRemove.append(v)
+ oldAttributes.append(StrokeAttribute(v.attribute))
+ it.increment()
+ if originalSize-len(verticesToRemove) < 2:
+ return
+ for sv in verticesToRemove:
+ stroke.remove_vertex(sv)
+ stroke.update_length()
+ stroke.resample(originalSize)
+ if stroke.stroke_vertices_size() != originalSize:
+ print("pyTipRemover: Warning: resampling problem")
+ it = stroke.stroke_vertices_begin()
+ for a in oldAttributes:
+ if it.is_end:
+ break
+ it.object.attribute = a
+ it.increment()
+ stroke.update_length()
+
+class pyTVertexRemoverShader(StrokeShader):
+ def shade(self, stroke):
+ if stroke.stroke_vertices_size() <= 3:
+ return
+ predTVertex = pyVertexNatureUP0D(Nature.T_VERTEX)
+ it = stroke.stroke_vertices_begin()
+ itlast = stroke.stroke_vertices_end()
+ itlast.decrement()
+ if predTVertex(it):
+ stroke.remove_vertex(it.object)
+ if predTVertex(itlast):
+ stroke.remove_vertex(itlast.object)
+ stroke.update_length()
+
+#class pyExtremitiesOrientationShader(StrokeShader):
+# def __init__(self, x1,y1,x2=0,y2=0):
+# StrokeShader.__init__(self)
+# self._v1 = mathutils.Vector([x1,y1])
+# self._v2 = mathutils.Vector([x2,y2])
+# def shade(self, stroke):
+# #print(self._v1.x,self._v1.y)
+# stroke.setBeginningOrientation(self._v1.x,self._v1.y)
+# stroke.setEndingOrientation(self._v2.x,self._v2.y)
+
+class pyHLRShader(StrokeShader):
+ def shade(self, stroke):
+ originalSize = stroke.stroke_vertices_size()
+ if originalSize < 4:
+ return
+ it = stroke.stroke_vertices_begin()
+ invisible = 0
+ it2 = StrokeVertexIterator(it)
+ it2.increment()
+ fe = self.get_fedge(it.object, it2.object)
+ if fe.viewedge.qi != 0:
+ invisible = 1
+ while not it2.is_end:
+ v = it.object
+ vnext = it2.object
+ if (v.nature & Nature.VIEW_VERTEX) != 0:
+ #if (v.nature & Nature.T_VERTEX) != 0:
+ fe = self.get_fedge(v, vnext)
+ qi = fe.viewedge.qi
+ if qi != 0:
+ invisible = 1
+ else:
+ invisible = 0
+ if invisible:
+ v.attribute.visible = False
+ it.increment()
+ it2.increment()
+ def get_fedge(self, it1, it2):
+ return it1.get_fedge(it2)
+
+class pyTVertexOrientationShader(StrokeShader):
+ def __init__(self):
+ StrokeShader.__init__(self)
+ self._Get2dDirection = Orientation2DF1D()
+ ## finds the TVertex orientation from the TVertex and
+ ## the previous or next edge
+ def findOrientation(self, tv, ve):
+ mateVE = tv.get_mate(ve)
+ if ve.qi != 0 or mateVE.qi != 0:
+ ait = AdjacencyIterator(tv,1,0)
+ winner = None
+ incoming = True
+ while not ait.is_end:
+ ave = ait.object
+ if ave.id != ve.id and ave.id != mateVE.id:
+ winner = ait.object
+ if not ait.isIncoming(): # FIXME
+ incoming = False
+ break
+ ait.increment()
+ if winner is not None:
+ if not incoming:
+ direction = self._Get2dDirection(winner.last_fedge)
+ else:
+ direction = self._Get2dDirection(winner.first_fedge)
+ return direction
+ return None
+ def castToTVertex(self, cp):
+ if cp.t2d() == 0.0:
+ return cp.first_svertex.viewvertex
+ elif cp.t2d() == 1.0:
+ return cp.second_svertex.viewvertex
+ return None
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ it2 = StrokeVertexIterator(it)
+ it2.increment()
+ ## case where the first vertex is a TVertex
+ v = it.object
+ if (v.nature & Nature.T_VERTEX) != 0:
+ tv = self.castToTVertex(v)
+ if tv is not None:
+ ve = self.get_fedge(v, it2.object).viewedge
+ dir = self.findOrientation(tv, ve)
+ if dir is not None:
+ #print(dir.x, dir.y)
+ v.attribute.set_attribute_vec2("orientation", dir)
+ while not it2.is_end:
+ vprevious = it.object
+ v = it2.object
+ if (v.nature & Nature.T_VERTEX) != 0:
+ tv = self.castToTVertex(v)
+ if tv is not None:
+ ve = self.get_fedge(vprevious, v).viewedge
+ dir = self.findOrientation(tv, ve)
+ if dir is not None:
+ #print(dir.x, dir.y)
+ v.attribute.set_attribute_vec2("orientation", dir)
+ it.increment()
+ it2.increment()
+ ## case where the last vertex is a TVertex
+ v = it.object
+ if (v.nature & Nature.T_VERTEX) != 0:
+ itPrevious = StrokeVertexIterator(it)
+ itPrevious.decrement()
+ tv = self.castToTVertex(v)
+ if tv is not None:
+ ve = self.get_fedge(itPrevious.object, v).viewedge
+ dir = self.findOrientation(tv, ve)
+ if dir is not None:
+ #print(dir.x, dir.y)
+ v.attribute.set_attribute_vec2("orientation", dir)
+ def get_fedge(self, it1, it2):
+ return it1.get_fedge(it2)
+
+class pySinusDisplacementShader(StrokeShader):
+ def __init__(self, f, a):
+ StrokeShader.__init__(self)
+ self._f = f
+ self._a = a
+ self._getNormal = Normal2DF0D()
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ v = it.object
+ #print(self._getNormal.name)
+ n = self._getNormal(Interface0DIterator(it))
+ p = v.point
+ u = v.u
+ a = self._a*(1-2*(math.fabs(u-0.5)))
+ n = n*a*math.cos(self._f*u*6.28)
+ #print(n.x, n.y)
+ v.point = p+n
+ #v.point = v.point+n*a*math.cos(f*v.u)
+ it.increment()
+ stroke.update_length()
+
+class pyPerlinNoise1DShader(StrokeShader):
+ def __init__(self, freq = 10, amp = 10, oct = 4, seed = -1):
+ StrokeShader.__init__(self)
+ self.__noise = Noise(seed)
+ self.__freq = freq
+ self.__amp = amp
+ self.__oct = oct
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ v = it.object
+ i = v.projected_x + v.projected_y
+ nres = self.__noise.turbulence1(i, self.__freq, self.__amp, self.__oct)
+ v.point = (v.projected_x + nres, v.projected_y + nres)
+ it.increment()
+ stroke.update_length()
+
+class pyPerlinNoise2DShader(StrokeShader):
+ def __init__(self, freq = 10, amp = 10, oct = 4, seed = -1):
+ StrokeShader.__init__(self)
+ self.__noise = Noise(seed)
+ self.__freq = freq
+ self.__amp = amp
+ self.__oct = oct
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ v = it.object
+ vec = mathutils.Vector([v.projected_x, v.projected_y])
+ nres = self.__noise.turbulence2(vec, self.__freq, self.__amp, self.__oct)
+ v.point = (v.projected_x + nres, v.projected_y + nres)
+ it.increment()
+ stroke.update_length()
+
+class pyBluePrintCirclesShader(StrokeShader):
+ def __init__(self, turns = 1, random_radius = 3, random_center = 5):
+ StrokeShader.__init__(self)
+ self.__turns = turns
+ self.__random_center = random_center
+ self.__random_radius = random_radius
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ if it.is_end:
+ return
+ p_min = it.object.point.copy()
+ p_max = it.object.point.copy()
+ while not it.is_end:
+ p = it.object.point
+ if p.x < p_min.x:
+ p_min.x = p.x
+ if p.x > p_max.x:
+ p_max.x = p.x
+ if p.y < p_min.y:
+ p_min.y = p.y
+ if p.y > p_max.y:
+ p_max.y = p.y
+ it.increment()
+ stroke.resample(32 * self.__turns)
+ sv_nb = stroke.stroke_vertices_size()
+# print("min :", p_min.x, p_min.y) # DEBUG
+# print("mean :", p_sum.x, p_sum.y) # DEBUG
+# print("max :", p_max.x, p_max.y) # DEBUG
+# print("----------------------") # DEBUG
+#######################################################
+ sv_nb = sv_nb // self.__turns
+ center = (p_min + p_max) / 2
+ radius = (center.x - p_min.x + center.y - p_min.y) / 2
+ p_new = mathutils.Vector([0, 0])
+#######################################################
+ R = self.__random_radius
+ C = self.__random_center
+ i = 0
+ it = stroke.stroke_vertices_begin()
+ for j in range(self.__turns):
+ prev_radius = radius
+ prev_center = center
+ radius = radius + random.randint(-R, R)
+ center = center + mathutils.Vector([random.randint(-C, C), random.randint(-C, C)])
+ while i < sv_nb and not it.is_end:
+ t = float(i) / float(sv_nb - 1)
+ r = prev_radius + (radius - prev_radius) * t
+ c = prev_center + (center - prev_center) * t
+ p_new.x = c.x + r * math.cos(2 * math.pi * t)
+ p_new.y = c.y + r * math.sin(2 * math.pi * t)
+ it.object.point = p_new
+ i = i + 1
+ it.increment()
+ i = 1
+ verticesToRemove = []
+ while not it.is_end:
+ verticesToRemove.append(it.object)
+ it.increment()
+ for sv in verticesToRemove:
+ stroke.remove_vertex(sv)
+ stroke.update_length()
+
+class pyBluePrintEllipsesShader(StrokeShader):
+ def __init__(self, turns = 1, random_radius = 3, random_center = 5):
+ StrokeShader.__init__(self)
+ self.__turns = turns
+ self.__random_center = random_center
+ self.__random_radius = random_radius
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ if it.is_end:
+ return
+ p_min = it.object.point.copy()
+ p_max = it.object.point.copy()
+ while not it.is_end:
+ p = it.object.point
+ if p.x < p_min.x:
+ p_min.x = p.x
+ if p.x > p_max.x:
+ p_max.x = p.x
+ if p.y < p_min.y:
+ p_min.y = p.y
+ if p.y > p_max.y:
+ p_max.y = p.y
+ it.increment()
+ stroke.resample(32 * self.__turns)
+ sv_nb = stroke.stroke_vertices_size()
+ sv_nb = sv_nb // self.__turns
+ center = (p_min + p_max) / 2
+ radius = center - p_min
+ p_new = mathutils.Vector([0, 0])
+#######################################################
+ R = self.__random_radius
+ C = self.__random_center
+ i = 0
+ it = stroke.stroke_vertices_begin()
+ for j in range(self.__turns):
+ prev_radius = radius
+ prev_center = center
+ radius = radius + mathutils.Vector([random.randint(-R, R), random.randint(-R, R)])
+ center = center + mathutils.Vector([random.randint(-C, C), random.randint(-C, C)])
+ while i < sv_nb and not it.is_end:
+ t = float(i) / float(sv_nb - 1)
+ r = prev_radius + (radius - prev_radius) * t
+ c = prev_center + (center - prev_center) * t
+ p_new.x = c.x + r.x * math.cos(2 * math.pi * t)
+ p_new.y = c.y + r.y * math.sin(2 * math.pi * t)
+ it.object.point = p_new
+ i = i + 1
+ it.increment()
+ i = 1
+ verticesToRemove = []
+ while not it.is_end:
+ verticesToRemove.append(it.object)
+ it.increment()
+ for sv in verticesToRemove:
+ stroke.remove_vertex(sv)
+ stroke.update_length()
+
+
+class pyBluePrintSquaresShader(StrokeShader):
+ def __init__(self, turns = 1, bb_len = 10, bb_rand = 0):
+ StrokeShader.__init__(self)
+ self.__turns = turns
+ self.__bb_len = bb_len
+ self.__bb_rand = bb_rand
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ if it.is_end:
+ return
+ p_min = it.object.point.copy()
+ p_max = it.object.point.copy()
+ while not it.is_end:
+ p = it.object.point
+ if p.x < p_min.x:
+ p_min.x = p.x
+ if p.x > p_max.x:
+ p_max.x = p.x
+ if p.y < p_min.y:
+ p_min.y = p.y
+ if p.y > p_max.y:
+ p_max.y = p.y
+ it.increment()
+ stroke.resample(32 * self.__turns)
+ sv_nb = stroke.stroke_vertices_size()
+#######################################################
+ sv_nb = sv_nb // self.__turns
+ first = sv_nb // 4
+ second = 2 * first
+ third = 3 * first
+ fourth = sv_nb
+ p_first = mathutils.Vector([p_min.x - self.__bb_len, p_min.y])
+ p_first_end = mathutils.Vector([p_max.x + self.__bb_len, p_min.y])
+ p_second = mathutils.Vector([p_max.x, p_min.y - self.__bb_len])
+ p_second_end = mathutils.Vector([p_max.x, p_max.y + self.__bb_len])
+ p_third = mathutils.Vector([p_max.x + self.__bb_len, p_max.y])
+ p_third_end = mathutils.Vector([p_min.x - self.__bb_len, p_max.y])
+ p_fourth = mathutils.Vector([p_min.x, p_max.y + self.__bb_len])
+ p_fourth_end = mathutils.Vector([p_min.x, p_min.y - self.__bb_len])
+#######################################################
+ R = self.__bb_rand
+ r = self.__bb_rand // 2
+ it = stroke.stroke_vertices_begin()
+ visible = True
+ for j in range(self.__turns):
+ p_first = p_first + mathutils.Vector([random.randint(-R, R), random.randint(-r, r)])
+ p_first_end = p_first_end + mathutils.Vector([random.randint(-R, R), random.randint(-r, r)])
+ p_second = p_second + mathutils.Vector([random.randint(-r, r), random.randint(-R, R)])
+ p_second_end = p_second_end + mathutils.Vector([random.randint(-r, r), random.randint(-R, R)])
+ p_third = p_third + mathutils.Vector([random.randint(-R, R), random.randint(-r, r)])
+ p_third_end = p_third_end + mathutils.Vector([random.randint(-R, R), random.randint(-r, r)])
+ p_fourth = p_fourth + mathutils.Vector([random.randint(-r, r), random.randint(-R, R)])
+ p_fourth_end = p_fourth_end + mathutils.Vector([random.randint(-r, r), random.randint(-R, R)])
+ vec_first = p_first_end - p_first
+ vec_second = p_second_end - p_second
+ vec_third = p_third_end - p_third
+ vec_fourth = p_fourth_end - p_fourth
+ i = 0
+ while i < sv_nb and not it.is_end:
+ if i < first:
+ p_new = p_first + vec_first * float(i)/float(first - 1)
+ if i == first - 1:
+ visible = False
+ elif i < second:
+ p_new = p_second + vec_second * float(i - first)/float(second - first - 1)
+ if i == second - 1:
+ visible = False
+ elif i < third:
+ p_new = p_third + vec_third * float(i - second)/float(third - second - 1)
+ if i == third - 1:
+ visible = False
+ else:
+ p_new = p_fourth + vec_fourth * float(i - third)/float(fourth - third - 1)
+ if i == fourth - 1:
+ visible = False
+ if it.object is None:
+ i = i + 1
+ it.increment()
+ if not visible:
+ visible = True
+ continue
+ it.object.point = p_new
+ it.object.attribute.visible = visible
+ if not visible:
+ visible = True
+ i = i + 1
+ it.increment()
+ verticesToRemove = []
+ while not it.is_end:
+ verticesToRemove.append(it.object)
+ it.increment()
+ for sv in verticesToRemove:
+ stroke.remove_vertex(sv)
+ stroke.update_length()
+
+
+class pyBluePrintDirectedSquaresShader(StrokeShader):
+ def __init__(self, turns = 1, bb_len = 10, mult = 1):
+ StrokeShader.__init__(self)
+ self.__mult = mult
+ self.__turns = turns
+ self.__bb_len = 1 + float(bb_len) / 100
+ def shade(self, stroke):
+ stroke.resample(32 * self.__turns)
+ p_mean = mathutils.Vector([0, 0])
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ p = it.object.point
+ p_mean = p_mean + p
+ it.increment()
+ sv_nb = stroke.stroke_vertices_size()
+ p_mean = p_mean / sv_nb
+ p_var_xx = 0
+ p_var_yy = 0
+ p_var_xy = 0
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ p = it.object.point
+ p_var_xx = p_var_xx + math.pow(p.x - p_mean.x, 2)
+ p_var_yy = p_var_yy + math.pow(p.y - p_mean.y, 2)
+ p_var_xy = p_var_xy + (p.x - p_mean.x) * (p.y - p_mean.y)
+ it.increment()
+ p_var_xx = p_var_xx / sv_nb
+ p_var_yy = p_var_yy / sv_nb
+ p_var_xy = p_var_xy / sv_nb
+## print(p_var_xx, p_var_yy, p_var_xy)
+ trace = p_var_xx + p_var_yy
+ det = p_var_xx * p_var_yy - p_var_xy * p_var_xy
+ sqrt_coeff = math.sqrt(trace * trace - 4 * det)
+ lambda1 = (trace + sqrt_coeff) / 2
+ lambda2 = (trace - sqrt_coeff) / 2
+## print(lambda1, lambda2)
+ theta = math.atan(2 * p_var_xy / (p_var_xx - p_var_yy)) / 2
+## print(theta)
+ if p_var_yy > p_var_xx:
+ e1 = mathutils.Vector([math.cos(theta + math.pi / 2), math.sin(theta + math.pi / 2)]) * math.sqrt(lambda1) * self.__mult
+ e2 = mathutils.Vector([math.cos(theta + math.pi), math.sin(theta + math.pi)]) * math.sqrt(lambda2) * self.__mult
+ else:
+ e1 = mathutils.Vector([math.cos(theta), math.sin(theta)]) * math.sqrt(lambda1) * self.__mult
+ e2 = mathutils.Vector([math.cos(theta + math.pi / 2), math.sin(theta + math.pi / 2)]) * math.sqrt(lambda2) * self.__mult
+#######################################################
+ sv_nb = sv_nb // self.__turns
+ first = sv_nb // 4
+ second = 2 * first
+ third = 3 * first
+ fourth = sv_nb
+ bb_len1 = self.__bb_len
+ bb_len2 = 1 + (bb_len1 - 1) * math.sqrt(lambda1 / lambda2)
+ p_first = p_mean - e1 - e2 * bb_len2
+ p_second = p_mean - e1 * bb_len1 + e2
+ p_third = p_mean + e1 + e2 * bb_len2
+ p_fourth = p_mean + e1 * bb_len1 - e2
+ vec_first = e2 * bb_len2 * 2
+ vec_second = e1 * bb_len1 * 2
+ vec_third = vec_first * -1
+ vec_fourth = vec_second * -1
+#######################################################
+ it = stroke.stroke_vertices_begin()
+ visible = True
+ for j in range(self.__turns):
+ i = 0
+ while i < sv_nb:
+ if i < first:
+ p_new = p_first + vec_first * float(i)/float(first - 1)
+ if i == first - 1:
+ visible = False
+ elif i < second:
+ p_new = p_second + vec_second * float(i - first)/float(second - first - 1)
+ if i == second - 1:
+ visible = False
+ elif i < third:
+ p_new = p_third + vec_third * float(i - second)/float(third - second - 1)
+ if i == third - 1:
+ visible = False
+ else:
+ p_new = p_fourth + vec_fourth * float(i - third)/float(fourth - third - 1)
+ if i == fourth - 1:
+ visible = False
+ it.object.point = p_new
+ it.object.attribute.visible = visible
+ if not visible:
+ visible = True
+ i = i + 1
+ it.increment()
+ verticesToRemove = []
+ while not it.is_end:
+ verticesToRemove.append(it.object)
+ it.increment()
+ for sv in verticesToRemove:
+ stroke.remove_vertex(sv)
+ stroke.update_length()
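pyBluePrintDirectedSquaresShader orients its squares along the principal axes of the stroke's point cloud: eigenvalues of the 2x2 covariance matrix from the trace and determinant, and the orientation angle from the off-diagonal term. The math-only sketch below isolates that computation; the function name is mine, and it uses atan2 where the shader's math.atan form divides by p_var_xx - p_var_yy and would fail when the two variances are equal.

```python
import math

# Principal axes of a 2D point cloud, as in pyBluePrintDirectedSquaresShader:
# eigenvalues l1 >= l2 of the covariance matrix via trace/determinant, and
# the principal direction theta via atan2 (illustrative sketch).
def principal_axes(points):
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points) / n
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    root = math.sqrt(max(tr * tr - 4 * det, 0.0))  # clamp rounding error
    l1, l2 = (tr + root) / 2, (tr - root) / 2
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return l1, l2, theta
```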
+
+class pyModulateAlphaShader(StrokeShader):
+ def __init__(self, min = 0, max = 1):
+ StrokeShader.__init__(self)
+ self.__min = min
+ self.__max = max
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ alpha = it.object.attribute.alpha
+ p = it.object.point
+ alpha = alpha * p.y / 400
+ if alpha < self.__min:
+ alpha = self.__min
+ elif alpha > self.__max:
+ alpha = self.__max
+ it.object.attribute.alpha = alpha
+ it.increment()
+
+## various
+class pyDummyShader(StrokeShader):
+ def shade(self, stroke):
+ it = stroke.stroke_vertices_begin()
+ while not it.is_end:
+ toto = Interface0DIterator(it)
+ att = it.object.attribute
+ att.color = (0.3, 0.4, 0.4)
+ att.thickness = (0, 5)
+ it.increment()
+
+class pyDebugShader(StrokeShader):
+ def shade(self, stroke):
+ fe = CF.get_selected_fedge()
+ id1 = fe.first_svertex.id
+ id2 = fe.second_svertex.id
+ #print(id1.first, id1.second)
+ #print(id2.first, id2.second)
+ it = stroke.stroke_vertices_begin()
+ found = False
+ foundfirst = False
+ foundsecond = False
+ while not it.is_end:
+ cp = it.object
+ if cp.first_svertex.id == id1 or cp.second_svertex.id == id1:
+ foundfirst = True
+ if cp.first_svertex.id == id2 or cp.second_svertex.id == id2:
+ foundsecond = True
+ if foundfirst and foundsecond:
+ found = True
+ break
+ it.increment()
+ if found:
+ print("The selected Stroke id is: ", stroke.id.first, stroke.id.second)
diff --git a/release/scripts/freestyle/modules/freestyle/types.py b/release/scripts/freestyle/modules/freestyle/types.py
new file mode 100644
index 00000000000..d7f7078e1aa
--- /dev/null
+++ b/release/scripts/freestyle/modules/freestyle/types.py
@@ -0,0 +1,81 @@
+# ##### BEGIN GPL LICENSE BLOCK #####
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
+#
+# ##### END GPL LICENSE BLOCK #####
+
+# module members
+from _freestyle import (
+ AdjacencyIterator,
+ BBox,
+ BinaryPredicate0D,
+ BinaryPredicate1D,
+ Chain,
+ ChainingIterator,
+ Curve,
+ CurvePoint,
+ CurvePointIterator,
+ FEdge,
+ FEdgeSharp,
+ FEdgeSmooth,
+ Id,
+ IntegrationType,
+ Interface0D,
+ Interface0DIterator,
+ Interface1D,
+ Iterator,
+ Material,
+ MediumType,
+ Nature,
+ Noise,
+ NonTVertex,
+ SShape,
+ SVertex,
+ SVertexIterator,
+ Stroke,
+ StrokeShader,
+ StrokeAttribute,
+ StrokeVertex,
+ StrokeVertexIterator,
+ TVertex,
+ UnaryFunction0D,
+ UnaryFunction0DDouble,
+ UnaryFunction0DEdgeNature,
+ UnaryFunction0DFloat,
+ UnaryFunction0DId,
+ UnaryFunction0DMaterial,
+ UnaryFunction0DUnsigned,
+ UnaryFunction0DVec2f,
+ UnaryFunction0DVec3f,
+ UnaryFunction0DVectorViewShape,
+ UnaryFunction0DViewShape,
+ UnaryFunction1D,
+ UnaryFunction1DDouble,
+ UnaryFunction1DEdgeNature,
+ UnaryFunction1DFloat,
+ UnaryFunction1DUnsigned,
+ UnaryFunction1DVec2f,
+ UnaryFunction1DVec3f,
+ UnaryFunction1DVectorViewShape,
+ UnaryFunction1DVoid,
+ UnaryPredicate0D,
+ UnaryPredicate1D,
+ ViewEdge,
+ ViewEdgeIterator,
+ ViewMap,
+ ViewShape,
+ ViewVertex,
+ orientedViewEdgeIterator,
+ )
diff --git a/release/scripts/freestyle/modules/freestyle/utils.py b/release/scripts/freestyle/modules/freestyle/utils.py
new file mode 100644
index 00000000000..af9cc109ae1
--- /dev/null
+++ b/release/scripts/freestyle/modules/freestyle/utils.py
@@ -0,0 +1,24 @@
+# ##### BEGIN GPL LICENSE BLOCK #####
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
+#
+# ##### END GPL LICENSE BLOCK #####
+
+# module members
+from _freestyle import (
+ ContextFunctions,
+ getCurrentScene,
+ integrate,
+ )
diff --git a/release/scripts/freestyle/style_modules/parameter_editor.py b/release/scripts/freestyle/modules/parameter_editor.py
index 83bd8dbda5e..dd8a6aa63ee 100644
--- a/release/scripts/freestyle/style_modules/parameter_editor.py
+++ b/release/scripts/freestyle/modules/parameter_editor.py
@@ -26,6 +26,8 @@ import math
import mathutils
import time
+from _freestyle import blendRamp, evaluateColorRamp, evaluateCurveMappingF
+
from ChainingIterators import pySketchyChainSilhouetteIterator, pySketchyChainingIterator
from freestyle import BackboneStretcherShader, BezierCurveShader, BinaryPredicate1D, ChainPredicateIterator, \
ChainSilhouetteIterator, ConstantColorShader, ContourUP1D, Curvature2DAngleF0D, ExternalContourUP1D, \
diff --git a/release/scripts/freestyle/style_modules/ChainingIterators.py b/release/scripts/freestyle/style_modules/ChainingIterators.py
deleted file mode 100644
index 9be5a0ef03f..00000000000
--- a/release/scripts/freestyle/style_modules/ChainingIterators.py
+++ /dev/null
@@ -1,713 +0,0 @@
-# ##### BEGIN GPL LICENSE BLOCK #####
-#
-# This program is free software; you can redistribute it and/or
-# modify it under the terms of the GNU General Public License
-# as published by the Free Software Foundation; either version 2
-# of the License, or (at your option) any later version.
-#
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with this program; if not, write to the Free Software Foundation,
-# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
-#
-# ##### END GPL LICENSE BLOCK #####
-
-# Filename : ChainingIterators.py
-# Author : Stephane Grabli
-# Date : 04/08/2005
-# Purpose : Chaining Iterators to be used with chaining operators
-
-from freestyle import AdjacencyIterator, ChainingIterator, ExternalContourUP1D, Nature, TVertex
-from freestyle import ContextFunctions as CF
-
-import bpy
-
-## The natural chaining iterator.
-## It follows edges of the same nature, following the topology of
-## objects, giving precedence to silhouettes, then borders,
-## then suggestive contours, then everything else. It never chains the same ViewEdge twice.
-## You can specify whether or not to stay within the selection.
-class pyChainSilhouetteIterator(ChainingIterator):
- def __init__(self, stayInSelection=True):
- ChainingIterator.__init__(self, stayInSelection, True, None, True)
- def init(self):
- pass
- def traverse(self, iter):
- winner = None
- it = AdjacencyIterator(iter)
- tvertex = self.next_vertex
- if type(tvertex) is TVertex:
- mateVE = tvertex.get_mate(self.current_edge)
- while not it.is_end:
- ve = it.object
- if ve.id == mateVE.id:
- winner = ve
- break
- it.increment()
- else:
- ## case of NonTVertex
- natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
- for i in range(len(natures)):
- currentNature = self.current_edge.nature
- if (natures[i] & currentNature) != 0:
- count=0
- while not it.is_end:
- visitNext = 0
- oNature = it.object.nature
- if (oNature & natures[i]) != 0:
- if natures[i] != oNature:
- for j in range(i):
- if (natures[j] & oNature) != 0:
- visitNext = 1
- break
- if visitNext != 0:
- break
- count = count+1
- winner = it.object
- it.increment()
- if count != 1:
- winner = None
- break
- return winner
-
-## The natural chaining iterator.
-## It follows edges of the same nature on the same
-## objects, giving precedence to silhouettes, then borders,
-## then suggestive contours, then everything else. It never chains the same ViewEdge twice.
-## You can specify whether or not to stay within the selection.
-## You can also specify whether or not to chain over edges that
-## have already been visited.
-class pyChainSilhouetteGenericIterator(ChainingIterator):
- def __init__(self, stayInSelection=True, stayInUnvisited=True):
- ChainingIterator.__init__(self, stayInSelection, stayInUnvisited, None, True)
- def init(self):
- pass
- def traverse(self, iter):
- winner = None
- it = AdjacencyIterator(iter)
- tvertex = self.next_vertex
- if type(tvertex) is TVertex:
- mateVE = tvertex.get_mate(self.current_edge)
- while not it.is_end:
- ve = it.object
- if ve.id == mateVE.id:
- winner = ve
- break
- it.increment()
- else:
- ## case of NonTVertex
- natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
- for i in range(len(natures)):
- currentNature = self.current_edge.nature
- if (natures[i] & currentNature) != 0:
- count=0
- while not it.is_end:
- visitNext = 0
- oNature = it.object.nature
- ve = it.object
- if ve.id == self.current_edge.id:
- it.increment()
- continue
- if (oNature & natures[i]) != 0:
- if natures[i] != oNature:
- for j in range(i):
- if (natures[j] & oNature) != 0:
- visitNext = 1
- break
- if visitNext != 0:
- break
- count = count+1
- winner = ve
- it.increment()
- if count != 1:
- winner = None
- break
- return winner
-
-class pyExternalContourChainingIterator(ChainingIterator):
- def __init__(self):
- ChainingIterator.__init__(self, False, True, None, True)
- self._isExternalContour = ExternalContourUP1D()
- def init(self):
- self._nEdges = 0
- self._isInSelection = 1
- def checkViewEdge(self, ve, orientation):
- if orientation != 0:
- vertex = ve.second_svertex()
- else:
- vertex = ve.first_svertex()
- it = AdjacencyIterator(vertex,1,1)
- while not it.is_end:
- ave = it.object
- if self._isExternalContour(ave):
- return 1
- it.increment()
-		print("pyExternalContourChainingIterator: didn't find the next edge")
- return 0
- def traverse(self, iter):
- winner = None
- it = AdjacencyIterator(iter)
- while not it.is_end:
- ve = it.object
- if self._isExternalContour(ve):
- if ve.time_stamp == CF.get_time_stamp():
- winner = ve
- it.increment()
-
- self._nEdges = self._nEdges+1
- if winner is None:
- orient = 1
- it = AdjacencyIterator(iter)
- while not it.is_end:
- ve = it.object
- if it.is_incoming:
- orient = 0
- good = self.checkViewEdge(ve,orient)
- if good != 0:
- winner = ve
- it.increment()
- return winner
-
-## The natural chaining iterator
-## with sketchy multiple touches.
-class pySketchyChainSilhouetteIterator(ChainingIterator):
- def __init__(self, nRounds=3,stayInSelection=True):
- ChainingIterator.__init__(self, stayInSelection, False, None, True)
- self._timeStamp = CF.get_time_stamp()+nRounds
- self._nRounds = nRounds
- def init(self):
- self._timeStamp = CF.get_time_stamp()+self._nRounds
- def traverse(self, iter):
- winner = None
- it = AdjacencyIterator(iter)
- tvertex = self.next_vertex
- if type(tvertex) is TVertex:
- mateVE = tvertex.get_mate(self.current_edge)
- while not it.is_end:
- ve = it.object
- if ve.id == mateVE.id:
- winner = ve
- break
- it.increment()
- else:
- ## case of NonTVertex
- natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
- for i in range(len(natures)):
- currentNature = self.current_edge.nature
- if (natures[i] & currentNature) != 0:
- count=0
- while not it.is_end:
- visitNext = 0
- oNature = it.object.nature
- ve = it.object
- if ve.id == self.current_edge.id:
- it.increment()
- continue
- if (oNature & natures[i]) != 0:
-							if natures[i] != oNature:
- for j in range(i):
- if (natures[j] & oNature) != 0:
- visitNext = 1
- break
- if visitNext != 0:
- break
- count = count+1
- winner = ve
- it.increment()
- if count != 1:
- winner = None
- break
- if winner is None:
- winner = self.current_edge
- if winner.chaining_time_stamp == self._timeStamp:
- winner = None
- return winner
-
-
-# Chaining iterator designed for a sketchy style:
-# it can chain the same ViewEdge several times
-# in order to produce multiple strokes per ViewEdge.
-class pySketchyChainingIterator(ChainingIterator):
- def __init__(self, nRounds=3, stayInSelection=True):
- ChainingIterator.__init__(self, stayInSelection, False, None, True)
- self._timeStamp = CF.get_time_stamp()+nRounds
- self._nRounds = nRounds
- def init(self):
- self._timeStamp = CF.get_time_stamp()+self._nRounds
- def traverse(self, iter):
- winner = None
- found = False
- it = AdjacencyIterator(iter)
- while not it.is_end:
- ve = it.object
- if ve.id == self.current_edge.id:
- found = True
- it.increment()
- continue
- winner = ve
- it.increment()
- if not found:
- # This is a fatal error condition: self.current_edge must be found
- # among the edges seen by the AdjacencyIterator [bug #35695].
- if bpy.app.debug_freestyle:
- print('pySketchyChainingIterator: current edge not found')
- return None
- if winner is None:
- winner = self.current_edge
- if winner.chaining_time_stamp == self._timeStamp:
- return None
- return winner
-
-
-## Chaining iterator that fills small occlusions.
-## percent:
-##     the max length of the occluded part,
-##     expressed as a percentage of the total chain length
-class pyFillOcclusionsRelativeChainingIterator(ChainingIterator):
- def __init__(self, percent):
- ChainingIterator.__init__(self, False, True, None, True)
- self._length = 0
- self._percent = float(percent)
- def init(self):
-		# The prospective chain length is computed at most
-		# once per chain, so the cached length is
-		# reset at the start of each chain:
- self._length = 0
- def traverse(self, iter):
- winner = None
- winnerOrientation = 0
- #print(self.current_edge.id.first, self.current_edge.id.second)
- it = AdjacencyIterator(iter)
- tvertex = self.next_vertex
- if type(tvertex) is TVertex:
- mateVE = tvertex.get_mate(self.current_edge)
- while not it.is_end:
- ve = it.object
- if ve.id == mateVE.id:
- winner = ve
- if not it.is_incoming:
- winnerOrientation = 1
- else:
- winnerOrientation = 0
- break
- it.increment()
- else:
- ## case of NonTVertex
- natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
- for nat in natures:
- if (self.current_edge.nature & nat) != 0:
- count=0
- while not it.is_end:
- ve = it.object
- if (ve.nature & nat) != 0:
- count = count+1
- winner = ve
- if not it.is_incoming:
- winnerOrientation = 1
- else:
- winnerOrientation = 0
- it.increment()
- if count != 1:
- winner = None
- break
- if winner is not None:
- # check whether this edge was part of the selection
- if winner.time_stamp != CF.get_time_stamp():
- #print("---", winner.id.first, winner.id.second)
- # if not, let's check whether it's short enough with
- # respect to the chain made without staying in the selection
- #------------------------------------------------------------
- # Did we compute the prospective chain length already ?
- if self._length == 0:
- #if not, let's do it
- _it = pyChainSilhouetteGenericIterator(0,0)
- _it.begin = winner
- _it.current_edge = winner
- _it.orientation = winnerOrientation
- _it.init()
- while not _it.is_end:
- ve = _it.object
- #print("--------", ve.id.first, ve.id.second)
- self._length = self._length + ve.length_2d
- _it.increment()
- if _it.is_begin:
-							break
- _it.begin = winner
- _it.current_edge = winner
- _it.orientation = winnerOrientation
- if not _it.is_begin:
- _it.decrement()
- while (not _it.is_end) and (not _it.is_begin):
- ve = _it.object
- #print("--------", ve.id.first, ve.id.second)
- self._length = self._length + ve.length_2d
- _it.decrement()
-
- # let's do the comparison:
-				# now compute the length of this connected, non-selected part:
- connexl = 0
- _cit = pyChainSilhouetteGenericIterator(0,0)
- _cit.begin = winner
- _cit.current_edge = winner
- _cit.orientation = winnerOrientation
- _cit.init()
- while _cit.is_end == 0 and _cit.object.time_stamp != CF.get_time_stamp():
- ve = _cit.object
- #print("-------- --------", ve.id.first, ve.id.second)
- connexl = connexl + ve.length_2d
- _cit.increment()
- if connexl > self._percent * self._length:
- winner = None
- return winner
-
-## Chaining iterator that fills small occlusions.
-## length:
-##     the max length of the occluded part,
-##     expressed in pixels
-class pyFillOcclusionsAbsoluteChainingIterator(ChainingIterator):
- def __init__(self, length):
- ChainingIterator.__init__(self, False, True, None, True)
- self._length = float(length)
- def init(self):
- pass
- def traverse(self, iter):
- winner = None
- winnerOrientation = 0
- #print(self.current_edge.id.first, self.current_edge.id.second)
- it = AdjacencyIterator(iter)
- tvertex = self.next_vertex
- if type(tvertex) is TVertex:
- mateVE = tvertex.get_mate(self.current_edge)
- while not it.is_end:
- ve = it.object
- if ve.id == mateVE.id:
- winner = ve
- if not it.is_incoming:
- winnerOrientation = 1
- else:
- winnerOrientation = 0
- break
- it.increment()
- else:
- ## case of NonTVertex
- natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
- for nat in natures:
- if (self.current_edge.nature & nat) != 0:
- count=0
- while not it.is_end:
- ve = it.object
- if (ve.nature & nat) != 0:
- count = count+1
- winner = ve
- if not it.is_incoming:
- winnerOrientation = 1
- else:
- winnerOrientation = 0
- it.increment()
- if count != 1:
- winner = None
- break
- if winner is not None:
- # check whether this edge was part of the selection
- if winner.time_stamp != CF.get_time_stamp():
- #print("---", winner.id.first, winner.id.second)
-				# now compute the length of this connected, non-selected part:
- connexl = 0
- _cit = pyChainSilhouetteGenericIterator(0,0)
- _cit.begin = winner
- _cit.current_edge = winner
- _cit.orientation = winnerOrientation
- _cit.init()
- while _cit.is_end == 0 and _cit.object.time_stamp != CF.get_time_stamp():
- ve = _cit.object
- #print("-------- --------", ve.id.first, ve.id.second)
- connexl = connexl + ve.length_2d
- _cit.increment()
- if connexl > self._length:
- winner = None
- return winner
-
-
-## Chaining iterator that fills small occlusions.
-## The occluded part must stay both under a percentage
-## (percent) of the total chain length and under an
-## absolute length (l, expressed in pixels).
-class pyFillOcclusionsAbsoluteAndRelativeChainingIterator(ChainingIterator):
- def __init__(self, percent, l):
- ChainingIterator.__init__(self, False, True, None, True)
- self._length = 0
- self._absLength = l
- self._percent = float(percent)
- def init(self):
-		# The prospective chain length is computed at most
-		# once per chain, so the cached length is
-		# reset at the start of each chain:
- self._length = 0
- def traverse(self, iter):
- winner = None
- winnerOrientation = 0
- #print(self.current_edge.id.first, self.current_edge.id.second)
- it = AdjacencyIterator(iter)
- tvertex = self.next_vertex
- if type(tvertex) is TVertex:
- mateVE = tvertex.get_mate(self.current_edge)
- while not it.is_end:
- ve = it.object
- if ve.id == mateVE.id:
- winner = ve
- if not it.is_incoming:
- winnerOrientation = 1
- else:
- winnerOrientation = 0
- break
- it.increment()
- else:
- ## case of NonTVertex
- natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
- for nat in natures:
- if (self.current_edge.nature & nat) != 0:
- count=0
- while not it.is_end:
- ve = it.object
- if (ve.nature & nat) != 0:
- count = count+1
- winner = ve
- if not it.is_incoming:
- winnerOrientation = 1
- else:
- winnerOrientation = 0
- it.increment()
- if count != 1:
- winner = None
- break
- if winner is not None:
- # check whether this edge was part of the selection
- if winner.time_stamp != CF.get_time_stamp():
- #print("---", winner.id.first, winner.id.second)
- # if not, let's check whether it's short enough with
- # respect to the chain made without staying in the selection
- #------------------------------------------------------------
- # Did we compute the prospective chain length already ?
- if self._length == 0:
- #if not, let's do it
- _it = pyChainSilhouetteGenericIterator(0,0)
- _it.begin = winner
- _it.current_edge = winner
- _it.orientation = winnerOrientation
- _it.init()
- while not _it.is_end:
- ve = _it.object
- #print("--------", ve.id.first, ve.id.second)
- self._length = self._length + ve.length_2d
- _it.increment()
- if _it.is_begin:
-							break
- _it.begin = winner
- _it.current_edge = winner
- _it.orientation = winnerOrientation
- if not _it.is_begin:
- _it.decrement()
- while (not _it.is_end) and (not _it.is_begin):
- ve = _it.object
- #print("--------", ve.id.first, ve.id.second)
- self._length = self._length + ve.length_2d
- _it.decrement()
-
- # let's do the comparison:
-				# now compute the length of this connected, non-selected part:
- connexl = 0
- _cit = pyChainSilhouetteGenericIterator(0,0)
- _cit.begin = winner
- _cit.current_edge = winner
- _cit.orientation = winnerOrientation
- _cit.init()
- while _cit.is_end == 0 and _cit.object.time_stamp != CF.get_time_stamp():
- ve = _cit.object
- #print("-------- --------", ve.id.first, ve.id.second)
- connexl = connexl + ve.length_2d
- _cit.increment()
- if (connexl > self._percent * self._length) or (connexl > self._absLength):
- winner = None
- return winner
-
-## Chaining iterator that fills small occlusions, regardless
-## of the actual selection.
-## The occluded part must stay both under a percentage
-## (percent) of the total chain length and under an
-## absolute length (l, expressed in pixels).
-class pyFillQi0AbsoluteAndRelativeChainingIterator(ChainingIterator):
- def __init__(self, percent, l):
- ChainingIterator.__init__(self, False, True, None, True)
- self._length = 0
- self._absLength = l
- self._percent = float(percent)
- def init(self):
-		# The prospective chain length is computed at most
-		# once per chain, so the cached length is
-		# reset at the start of each chain:
- self._length = 0
- def traverse(self, iter):
- winner = None
- winnerOrientation = 0
- #print(self.current_edge.id.first, self.current_edge.id.second)
- it = AdjacencyIterator(iter)
- tvertex = self.next_vertex
- if type(tvertex) is TVertex:
- mateVE = tvertex.get_mate(self.current_edge)
- while not it.is_end:
- ve = it.object
- if ve.id == mateVE.id:
- winner = ve
- if not it.is_incoming:
- winnerOrientation = 1
- else:
- winnerOrientation = 0
- break
- it.increment()
- else:
- ## case of NonTVertex
- natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
- for nat in natures:
- if (self.current_edge.nature & nat) != 0:
- count=0
- while not it.is_end:
- ve = it.object
- if (ve.nature & nat) != 0:
- count = count+1
- winner = ve
- if not it.is_incoming:
- winnerOrientation = 1
- else:
- winnerOrientation = 0
- it.increment()
- if count != 1:
- winner = None
- break
- if winner is not None:
- # check whether this edge was part of the selection
- if winner.qi != 0:
- #print("---", winner.id.first, winner.id.second)
- # if not, let's check whether it's short enough with
- # respect to the chain made without staying in the selection
- #------------------------------------------------------------
- # Did we compute the prospective chain length already ?
- if self._length == 0:
- #if not, let's do it
- _it = pyChainSilhouetteGenericIterator(0,0)
- _it.begin = winner
- _it.current_edge = winner
- _it.orientation = winnerOrientation
- _it.init()
- while not _it.is_end:
- ve = _it.object
- #print("--------", ve.id.first, ve.id.second)
- self._length = self._length + ve.length_2d
- _it.increment()
- if _it.is_begin:
-							break
- _it.begin = winner
- _it.current_edge = winner
- _it.orientation = winnerOrientation
- if not _it.is_begin:
- _it.decrement()
- while (not _it.is_end) and (not _it.is_begin):
- ve = _it.object
- #print("--------", ve.id.first, ve.id.second)
- self._length = self._length + ve.length_2d
- _it.decrement()
-
- # let's do the comparison:
-				# now compute the length of this connected, non-selected part:
- connexl = 0
- _cit = pyChainSilhouetteGenericIterator(0,0)
- _cit.begin = winner
- _cit.current_edge = winner
- _cit.orientation = winnerOrientation
- _cit.init()
- while not _cit.is_end and _cit.object.qi != 0:
- ve = _cit.object
- #print("-------- --------", ve.id.first, ve.id.second)
- connexl = connexl + ve.length_2d
- _cit.increment()
- if (connexl > self._percent * self._length) or (connexl > self._absLength):
- winner = None
- return winner
-
-
-## The natural chaining iterator.
-## It follows edges of the same nature on the same
-## objects, giving precedence to silhouettes, then borders,
-## then suggestive contours, then everything else. It never chains the same ViewEdge twice.
-## You can specify whether or not to stay within the selection.
-class pyNoIdChainSilhouetteIterator(ChainingIterator):
- def __init__(self, stayInSelection=True):
- ChainingIterator.__init__(self, stayInSelection, True, None, True)
- def init(self):
- pass
- def traverse(self, iter):
- winner = None
- it = AdjacencyIterator(iter)
- tvertex = self.next_vertex
- if type(tvertex) is TVertex:
- mateVE = tvertex.get_mate(self.current_edge)
- while not it.is_end:
- ve = it.object
- feB = self.current_edge.last_fedge
- feA = ve.first_fedge
- vB = feB.second_svertex
- vA = feA.first_svertex
- if vA.id.first == vB.id.first:
- winner = ve
- break
- feA = self.current_edge.first_fedge
- feB = ve.last_fedge
- vB = feB.second_svertex
- vA = feA.first_svertex
- if vA.id.first == vB.id.first:
- winner = ve
- break
- feA = self.current_edge.last_fedge
- feB = ve.last_fedge
- vB = feB.second_svertex
- vA = feA.second_svertex
- if vA.id.first == vB.id.first:
- winner = ve
- break
- feA = self.current_edge.first_fedge
- feB = ve.first_fedge
- vB = feB.first_svertex
- vA = feA.first_svertex
- if vA.id.first == vB.id.first:
- winner = ve
- break
- it.increment()
- else:
- ## case of NonTVertex
- natures = [Nature.SILHOUETTE,Nature.BORDER,Nature.CREASE,Nature.SUGGESTIVE_CONTOUR,Nature.VALLEY,Nature.RIDGE]
- for i in range(len(natures)):
- currentNature = self.current_edge.nature
- if (natures[i] & currentNature) != 0:
- count=0
- while not it.is_end:
- visitNext = 0
- oNature = it.object.nature
- if (oNature & natures[i]) != 0:
- if natures[i] != oNature:
- for j in range(i):
- if (natures[j] & oNature) != 0:
- visitNext = 1
- break
- if visitNext != 0:
- break
- count = count+1
- winner = it.object
- it.increment()
- if count != 1:
- winner = None
- break
- return winner
-
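The various pyChainSilhouette* iterators deleted above all share the same non-TVertex rule: among adjacent edges, chain only when exactly one candidate shares the current edge's nature class, deferring to candidates that also carry a higher-precedence nature. The standalone sketch below approximates just that decision, with made-up `Nature` flag values and no Blender dependency:

```python
from enum import IntFlag

class Nature(IntFlag):
    """Stand-in for freestyle's Nature bit flags; the names match the
    script above, but the numeric values here are made up."""
    SILHOUETTE = 1
    BORDER = 2
    CREASE = 4
    SUGGESTIVE_CONTOUR = 8
    VALLEY = 16
    RIDGE = 32

# Silhouettes win over borders, borders over creases, and so on.
PRECEDENCE = [Nature.SILHOUETTE, Nature.BORDER, Nature.CREASE,
              Nature.SUGGESTIVE_CONTOUR, Nature.VALLEY, Nature.RIDGE]

def pick_next(current_nature, candidate_natures):
    """Approximate the non-TVertex rule shared by the pyChainSilhouette*
    iterators: chain only when exactly one adjacent edge shares the
    current nature class, deferring to higher-precedence natures."""
    for i, nat in enumerate(PRECEDENCE):
        if not (current_nature & nat):
            continue
        count, winner = 0, None
        for cand in candidate_natures:
            if cand & nat:
                # A candidate that also carries a higher-precedence
                # nature will be chained by that class instead.
                if cand != nat and any(cand & PRECEDENCE[j] for j in range(i)):
                    break
                count += 1
                winner = cand
        return winner if count == 1 else None
    return None
```

Ambiguity (two matching candidates) yields `None`, which is why the original iterators stop chaining at forks instead of guessing.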
diff --git a/release/scripts/freestyle/style_modules/Functions0D.py b/release/scripts/freestyle/style_modules/Functions0D.py
deleted file mode 100644
index eac168a3b6d..00000000000
--- a/release/scripts/freestyle/style_modules/Functions0D.py
+++ /dev/null
@@ -1,105 +0,0 @@
-# ##### BEGIN GPL LICENSE BLOCK #####
-#
-# This program is free software; you can redistribute it and/or
-# modify it under the terms of the GNU General Public License
-# as published by the Free Software Foundation; either version 2
-# of the License, or (at your option) any later version.
-#
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with this program; if not, write to the Free Software Foundation,
-# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
-#
-# ##### END GPL LICENSE BLOCK #####
-
-# Filename : Functions0D.py
-# Authors : Fredo Durand, Stephane Grabli, Francois Sillion, Emmanuel Turquin
-# Date : 30/06/2005
-# Purpose : Functions (functors) to be used for 0D elements
-
-from freestyle import Curvature2DAngleF0D, CurvePoint, ReadCompleteViewMapPixelF0D, \
- ReadSteerableViewMapPixelF0D, UnaryFunction0DDouble, UnaryFunction0DMaterial, \
- UnaryFunction0DVec2f
-from freestyle import ContextFunctions as CF
-
-import math
-import mathutils
-
-class CurveMaterialF0D(UnaryFunction0DMaterial):
- # A replacement of the built-in MaterialF0D for stroke creation.
- # MaterialF0D does not work with Curves and Strokes.
- def __call__(self, inter):
- cp = inter.object
- assert(isinstance(cp, CurvePoint))
- fe = cp.first_svertex.get_fedge(cp.second_svertex)
- assert(fe is not None)
- return fe.material if fe.is_smooth else fe.material_left
-
-class pyInverseCurvature2DAngleF0D(UnaryFunction0DDouble):
- def __call__(self, inter):
- func = Curvature2DAngleF0D()
- c = func(inter)
-		return (math.pi - c)
-
-class pyCurvilinearLengthF0D(UnaryFunction0DDouble):
- def __call__(self, inter):
- cp = inter.object
- assert(isinstance(cp, CurvePoint))
- return cp.t2d
-
-## estimate anisotropy of density
-class pyDensityAnisotropyF0D(UnaryFunction0DDouble):
- def __init__(self,level):
- UnaryFunction0DDouble.__init__(self)
- self.IsoDensity = ReadCompleteViewMapPixelF0D(level)
- self.d0Density = ReadSteerableViewMapPixelF0D(0, level)
- self.d1Density = ReadSteerableViewMapPixelF0D(1, level)
- self.d2Density = ReadSteerableViewMapPixelF0D(2, level)
- self.d3Density = ReadSteerableViewMapPixelF0D(3, level)
- def __call__(self, inter):
- c_iso = self.IsoDensity(inter)
- c_0 = self.d0Density(inter)
- c_1 = self.d1Density(inter)
- c_2 = self.d2Density(inter)
- c_3 = self.d3Density(inter)
- cMax = max(max(c_0,c_1), max(c_2,c_3))
- cMin = min(min(c_0,c_1), min(c_2,c_3))
- if c_iso == 0:
- v = 0
- else:
- v = (cMax-cMin)/c_iso
- return v
-
-## Returns the gradient vector for a pixel.
-## l:
-##     the level at which to compute the gradient
-class pyViewMapGradientVectorF0D(UnaryFunction0DVec2f):
- def __init__(self, l):
- UnaryFunction0DVec2f.__init__(self)
- self._l = l
- self._step = math.pow(2,self._l)
- def __call__(self, iter):
- p = iter.object.point_2d
- gx = CF.read_complete_view_map_pixel(self._l, int(p.x+self._step), int(p.y)) - \
- CF.read_complete_view_map_pixel(self._l, int(p.x), int(p.y))
- gy = CF.read_complete_view_map_pixel(self._l, int(p.x), int(p.y+self._step)) - \
- CF.read_complete_view_map_pixel(self._l, int(p.x), int(p.y))
- return mathutils.Vector([gx, gy])
-
-class pyViewMapGradientNormF0D(UnaryFunction0DDouble):
- def __init__(self, l):
- UnaryFunction0DDouble.__init__(self)
- self._l = l
- self._step = math.pow(2,self._l)
- def __call__(self, iter):
- p = iter.object.point_2d
- gx = CF.read_complete_view_map_pixel(self._l, int(p.x+self._step), int(p.y)) - \
- CF.read_complete_view_map_pixel(self._l, int(p.x), int(p.y))
- gy = CF.read_complete_view_map_pixel(self._l, int(p.x), int(p.y+self._step)) - \
- CF.read_complete_view_map_pixel(self._l, int(p.x), int(p.y))
- grad = mathutils.Vector([gx, gy])
- return grad.length
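pyViewMapGradientVectorF0D and pyViewMapGradientNormF0D above both compute the view-map gradient by forward differences with a step of 2^l. That kernel can be sketched without Blender; `read_pixel(level, x, y)` below is a stand-in for `ContextFunctions.read_complete_view_map_pixel`:

```python
import math

def view_map_gradient(read_pixel, x, y, level):
    """Forward-difference gradient at (x, y) with step = 2**level,
    mirroring pyViewMapGradientVectorF0D above."""
    step = 2 ** level
    center = read_pixel(level, x, y)
    gx = read_pixel(level, x + step, y) - center
    gy = read_pixel(level, x, y + step) - center
    return gx, gy

def view_map_gradient_norm(read_pixel, x, y, level):
    """Length of that gradient, as pyViewMapGradientNormF0D returns."""
    gx, gy = view_map_gradient(read_pixel, x, y, level)
    return math.hypot(gx, gy)
```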
diff --git a/release/scripts/freestyle/style_modules/Functions1D.py b/release/scripts/freestyle/style_modules/Functions1D.py
deleted file mode 100644
index 3e74b1817eb..00000000000
--- a/release/scripts/freestyle/style_modules/Functions1D.py
+++ /dev/null
@@ -1,58 +0,0 @@
-# ##### BEGIN GPL LICENSE BLOCK #####
-#
-# This program is free software; you can redistribute it and/or
-# modify it under the terms of the GNU General Public License
-# as published by the Free Software Foundation; either version 2
-# of the License, or (at your option) any later version.
-#
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with this program; if not, write to the Free Software Foundation,
-# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
-#
-# ##### END GPL LICENSE BLOCK #####
-
-# Filename : Functions1D.py
-# Authors : Fredo Durand, Stephane Grabli, Francois Sillion, Emmanuel Turquin
-# Date : 08/04/2005
-# Purpose : Functions (functors) to be used for 1D elements
-
-from freestyle import GetProjectedZF1D, IntegrationType, UnaryFunction1DDouble, integrate
-from Functions0D import pyDensityAnisotropyF0D, pyViewMapGradientNormF0D
-import string
-
-class pyGetInverseProjectedZF1D(UnaryFunction1DDouble):
- def __call__(self, inter):
- func = GetProjectedZF1D()
- z = func(inter)
- return (1.0 - z)
-
-class pyGetSquareInverseProjectedZF1D(UnaryFunction1DDouble):
- def __call__(self, inter):
- func = GetProjectedZF1D()
- z = func(inter)
- return (1.0 - z*z)
-
-class pyDensityAnisotropyF1D(UnaryFunction1DDouble):
- def __init__(self,level, integrationType=IntegrationType.MEAN, sampling=2.0):
- UnaryFunction1DDouble.__init__(self, integrationType)
- self._func = pyDensityAnisotropyF0D(level)
- self._integration = integrationType
- self._sampling = sampling
- def __call__(self, inter):
- v = integrate(self._func, inter.pointsBegin(self._sampling), inter.pointsEnd(self._sampling), self._integration)
- return v
-
-class pyViewMapGradientNormF1D(UnaryFunction1DDouble):
- def __init__(self,l, integrationType, sampling=2.0):
- UnaryFunction1DDouble.__init__(self, integrationType)
- self._func = pyViewMapGradientNormF0D(l)
- self._integration = integrationType
- self._sampling = sampling
- def __call__(self, inter):
- v = integrate(self._func, inter.pointsBegin(self._sampling), inter.pointsEnd(self._sampling), self._integration)
- return v
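The 1D functions above all reduce to the same pattern: sample a 0D functor along the element and combine the samples with `integrate` (here always with IntegrationType.MEAN). A minimal plain-Python stand-in for the MEAN case, assuming the sample points are already given:

```python
def integrate_mean(func0d, samples):
    """Average a 0D functor over sample points: a stand-in for
    integrate(func, pointsBegin(t), pointsEnd(t), IntegrationType.MEAN)."""
    values = [func0d(p) for p in samples]
    if not values:
        raise ValueError("integration needs at least one sample")
    return sum(values) / len(values)
```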
diff --git a/release/scripts/freestyle/style_modules/PredicatesB1D.py b/release/scripts/freestyle/style_modules/PredicatesB1D.py
deleted file mode 100644
index 604692a999f..00000000000
--- a/release/scripts/freestyle/style_modules/PredicatesB1D.py
+++ /dev/null
@@ -1,73 +0,0 @@
-# ##### BEGIN GPL LICENSE BLOCK #####
-#
-# This program is free software; you can redistribute it and/or
-# modify it under the terms of the GNU General Public License
-# as published by the Free Software Foundation; either version 2
-# of the License, or (at your option) any later version.
-#
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with this program; if not, write to the Free Software Foundation,
-# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
-#
-# ##### END GPL LICENSE BLOCK #####
-
-# Filename : PredicatesB1D.py
-# Authors : Fredo Durand, Stephane Grabli, Francois Sillion, Emmanuel Turquin
-# Date : 08/04/2005
-# Purpose : Binary predicates (functors) to be used for 1D elements
-
-from freestyle import BinaryPredicate1D, GetZF1D, IntegrationType, Nature, SameShapeIdBP1D, ZDiscontinuityF1D
-from Functions1D import pyViewMapGradientNormF1D
-
-import random
-
-class pyZBP1D(BinaryPredicate1D):
- def __call__(self, i1, i2):
- func = GetZF1D()
- return (func(i1) > func(i2))
-
-class pyZDiscontinuityBP1D(BinaryPredicate1D):
- def __init__(self, iType = IntegrationType.MEAN):
- BinaryPredicate1D.__init__(self)
- self._GetZDiscontinuity = ZDiscontinuityF1D(iType)
- def __call__(self, i1, i2):
- return (self._GetZDiscontinuity(i1) > self._GetZDiscontinuity(i2))
-
-class pyLengthBP1D(BinaryPredicate1D):
- def __call__(self, i1, i2):
- return (i1.length_2d > i2.length_2d)
-
-class pySilhouetteFirstBP1D(BinaryPredicate1D):
- def __call__(self, inter1, inter2):
- bpred = SameShapeIdBP1D()
- if (bpred(inter1, inter2) != 1):
- return 0
- if (inter1.nature & Nature.SILHOUETTE):
- return (inter2.nature & Nature.SILHOUETTE) != 0
- return (inter1.nature == inter2.nature)
-
-class pyNatureBP1D(BinaryPredicate1D):
- def __call__(self, inter1, inter2):
- return (inter1.nature & inter2.nature)
-
-class pyViewMapGradientNormBP1D(BinaryPredicate1D):
- def __init__(self,l, sampling=2.0):
- BinaryPredicate1D.__init__(self)
- self._GetGradient = pyViewMapGradientNormF1D(l, IntegrationType.MEAN)
- def __call__(self, i1,i2):
- #print("compare gradient")
- return (self._GetGradient(i1) > self._GetGradient(i2))
-
-class pyShuffleBP1D(BinaryPredicate1D):
- def __init__(self):
- BinaryPredicate1D.__init__(self)
- random.seed(1)
- def __call__(self, inter1, inter2):
- r1 = random.uniform(0,1)
- r2 = random.uniform(0,1)
- return (r1<r2)
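The binary predicates above are "greater-than" style comparators consumed by Freestyle's sort operator. Outside Blender, the same predicate shape plugs into Python sorting via `functools.cmp_to_key`; the dict-based chains here are a hypothetical stand-in for Interface1D objects:

```python
from functools import cmp_to_key

def length_bp1d(i1, i2):
    """Plain version of pyLengthBP1D: True when i1 should come first."""
    return i1["length_2d"] > i2["length_2d"]

def sort_with_predicate(chains, bp):
    """Sort with a Freestyle-style binary predicate, where bp(a, b) True
    means a orders before b (a sketch of what a sort operator might do)."""
    def cmp(a, b):
        if bp(a, b):
            return -1
        if bp(b, a):
            return 1
        return 0
    return sorted(chains, key=cmp_to_key(cmp))
```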
diff --git a/release/scripts/freestyle/style_modules/PredicatesU0D.py b/release/scripts/freestyle/style_modules/PredicatesU0D.py
deleted file mode 100644
index 7119740ce49..00000000000
--- a/release/scripts/freestyle/style_modules/PredicatesU0D.py
+++ /dev/null
@@ -1,96 +0,0 @@
-# ##### BEGIN GPL LICENSE BLOCK #####
-#
-# This program is free software; you can redistribute it and/or
-# modify it under the terms of the GNU General Public License
-# as published by the Free Software Foundation; either version 2
-# of the License, or (at your option) any later version.
-#
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with this program; if not, write to the Free Software Foundation,
-# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
-#
-# ##### END GPL LICENSE BLOCK #####
-
-# Filename : PredicatesU0D.py
-# Authors : Fredo Durand, Stephane Grabli, Francois Sillion, Emmanuel Turquin
-# Date : 08/04/2005
-# Purpose : Unary predicates (functors) to be used for 0D elements
-
-from freestyle import Curvature2DAngleF0D, Nature, QuantitativeInvisibilityF0D, UnaryPredicate0D
-from Functions0D import pyCurvilinearLengthF0D
-
-class pyHigherCurvature2DAngleUP0D(UnaryPredicate0D):
- def __init__(self,a):
- UnaryPredicate0D.__init__(self)
- self._a = a
- def __call__(self, inter):
- func = Curvature2DAngleF0D()
- a = func(inter)
- return (a > self._a)
-
-class pyUEqualsUP0D(UnaryPredicate0D):
- def __init__(self,u, w):
- UnaryPredicate0D.__init__(self)
- self._u = u
- self._w = w
- def __call__(self, inter):
- func = pyCurvilinearLengthF0D()
- u = func(inter)
- return (u > (self._u-self._w)) and (u < (self._u+self._w))
-
-class pyVertexNatureUP0D(UnaryPredicate0D):
- def __init__(self,nature):
- UnaryPredicate0D.__init__(self)
- self._nature = nature
- def __call__(self, inter):
- v = inter.object
- return (v.nature & self._nature) != 0
-
-## Check whether an Interface0DIterator points to a TVertex
-## and whether that TVertex is the hidden one
-## (inferred from the context).
-class pyBackTVertexUP0D(UnaryPredicate0D):
- def __init__(self):
- UnaryPredicate0D.__init__(self)
- self._getQI = QuantitativeInvisibilityF0D()
- def __call__(self, iter):
- if (iter.object.nature & Nature.T_VERTEX) == 0:
- return 0
- if iter.is_end:
- return 0
- if self._getQI(iter) != 0:
- return 1
- return 0
-
-class pyParameterUP0DGoodOne(UnaryPredicate0D):
- def __init__(self,pmin,pmax):
- UnaryPredicate0D.__init__(self)
- self._m = pmin
- self._M = pmax
- #self.getCurvilinearAbscissa = GetCurvilinearAbscissaF0D()
- def __call__(self, inter):
- #s = self.getCurvilinearAbscissa(inter)
- u = inter.u
- #print(u)
- return ((u>=self._m) and (u<=self._M))
-
-class pyParameterUP0D(UnaryPredicate0D):
- def __init__(self,pmin,pmax):
- UnaryPredicate0D.__init__(self)
- self._m = pmin
- self._M = pmax
- #self.getCurvilinearAbscissa = GetCurvilinearAbscissaF0D()
- def __call__(self, inter):
- func = Curvature2DAngleF0D()
- c = func(inter)
- b1 = (c>0.1)
- #s = self.getCurvilinearAbscissa(inter)
- u = inter.u
- #print(u)
- b = ((u>=self._m) and (u<=self._M))
- return b and b1
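The 0D predicates deleted above all follow one pattern: a functor class that stores thresholds in `__init__` and returns a boolean from `__call__`. As a minimal standalone sketch (plain Python; `ParameterRangePredicate` is a hypothetical stand-in and does not use the real Freestyle `UnaryPredicate0D` base class), the `pyParameterUP0D`-style range test looks like:

```python
# Standalone sketch of the 0D range-predicate pattern; not the Freestyle API.
class ParameterRangePredicate:
    def __init__(self, pmin, pmax):
        self._m = pmin
        self._M = pmax

    def __call__(self, vertex_u):
        # True when the curvilinear parameter u falls inside [pmin, pmax]
        return self._m <= vertex_u <= self._M

pred = ParameterRangePredicate(0.25, 0.75)
print(pred(0.5), pred(0.9))  # True False
```

In the real API the `__call__` argument is an `Interface0DIterator` rather than a bare float, but the thresholding logic is the same.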
diff --git a/release/scripts/freestyle/style_modules/PredicatesU1D.py b/release/scripts/freestyle/style_modules/PredicatesU1D.py
deleted file mode 100644
index 54f920d0563..00000000000
--- a/release/scripts/freestyle/style_modules/PredicatesU1D.py
+++ /dev/null
@@ -1,342 +0,0 @@
-# ##### BEGIN GPL LICENSE BLOCK #####
-#
-# This program is free software; you can redistribute it and/or
-# modify it under the terms of the GNU General Public License
-# as published by the Free Software Foundation; either version 2
-# of the License, or (at your option) any later version.
-#
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with this program; if not, write to the Free Software Foundation,
-# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
-#
-# ##### END GPL LICENSE BLOCK #####
-
-# Filename : PredicatesU1D.py
-# Authors : Fredo Durand, Stephane Grabli, Francois Sillion, Emmanuel Turquin
-# Date : 08/04/2005
-# Purpose : Unary predicates (functors) to be used for 1D elements
-
-from freestyle import Curvature2DAngleF0D, CurveNatureF1D, DensityF1D, GetCompleteViewMapDensityF1D, \
- GetDirectionalViewMapDensityF1D, GetOccludersF1D, GetProjectedZF1D, GetShapeF1D, GetSteerableViewMapDensityF1D, \
- IntegrationType, ShapeUP1D, TVertex, UnaryPredicate1D
-from Functions1D import pyDensityAnisotropyF1D, pyViewMapGradientNormF1D
-
-class pyNFirstUP1D(UnaryPredicate1D):
- def __init__(self, n):
- UnaryPredicate1D.__init__(self)
- self.__n = n
- self.__count = 0
- def __call__(self, inter):
- self.__count = self.__count + 1
- if self.__count <= self.__n:
- return 1
- return 0
-
-class pyHigherLengthUP1D(UnaryPredicate1D):
- def __init__(self,l):
- UnaryPredicate1D.__init__(self)
- self._l = l
- def __call__(self, inter):
- return (inter.length_2d > self._l)
-
-class pyNatureUP1D(UnaryPredicate1D):
- def __init__(self,nature):
- UnaryPredicate1D.__init__(self)
- self._nature = nature
- self._getNature = CurveNatureF1D()
- def __call__(self, inter):
- if(self._getNature(inter) & self._nature):
- return 1
- return 0
-
-class pyHigherNumberOfTurnsUP1D(UnaryPredicate1D):
- def __init__(self,n,a):
- UnaryPredicate1D.__init__(self)
- self._n = n
- self._a = a
- def __call__(self, inter):
- count = 0
- func = Curvature2DAngleF0D()
- it = inter.vertices_begin()
- while not it.is_end:
- if func(it) > self._a:
- count = count+1
- if count > self._n:
- return 1
- it.increment()
- return 0
-
-class pyDensityUP1D(UnaryPredicate1D):
- def __init__(self,wsize,threshold, integration = IntegrationType.MEAN, sampling=2.0):
- UnaryPredicate1D.__init__(self)
- self._wsize = wsize
- self._threshold = threshold
- self._integration = integration
- self._func = DensityF1D(self._wsize, self._integration, sampling)
- def __call__(self, inter):
- if self._func(inter) < self._threshold:
- return 1
- return 0
-
-class pyLowSteerableViewMapDensityUP1D(UnaryPredicate1D):
- def __init__(self,threshold, level,integration = IntegrationType.MEAN):
- UnaryPredicate1D.__init__(self)
- self._threshold = threshold
- self._level = level
- self._integration = integration
- def __call__(self, inter):
- func = GetSteerableViewMapDensityF1D(self._level, self._integration)
- v = func(inter)
- #print(v)
- if v < self._threshold:
- return 1
- return 0
-
-class pyLowDirectionalViewMapDensityUP1D(UnaryPredicate1D):
- def __init__(self,threshold, orientation, level,integration = IntegrationType.MEAN):
- UnaryPredicate1D.__init__(self)
- self._threshold = threshold
- self._orientation = orientation
- self._level = level
- self._integration = integration
- def __call__(self, inter):
- func = GetDirectionalViewMapDensityF1D(self._orientation, self._level, self._integration)
- v = func(inter)
- #print(v)
- if v < self._threshold:
- return 1
- return 0
-
-class pyHighSteerableViewMapDensityUP1D(UnaryPredicate1D):
- def __init__(self,threshold, level,integration = IntegrationType.MEAN):
- UnaryPredicate1D.__init__(self)
- self._threshold = threshold
- self._level = level
- self._integration = integration
- self._func = GetSteerableViewMapDensityF1D(self._level, self._integration)
- def __call__(self, inter):
- v = self._func(inter)
- if v > self._threshold:
- return 1
- return 0
-
-class pyHighDirectionalViewMapDensityUP1D(UnaryPredicate1D):
- def __init__(self,threshold, orientation, level,integration = IntegrationType.MEAN, sampling=2.0):
- UnaryPredicate1D.__init__(self)
- self._threshold = threshold
- self._orientation = orientation
- self._level = level
- self._integration = integration
- self._sampling = sampling
- def __call__(self, inter):
- func = GetDirectionalViewMapDensityF1D(self._orientation, self._level, self._integration, self._sampling)
- v = func(inter)
- if v > self._threshold:
- return 1
- return 0
-
-class pyHighViewMapDensityUP1D(UnaryPredicate1D):
- def __init__(self,threshold, level,integration = IntegrationType.MEAN, sampling=2.0):
- UnaryPredicate1D.__init__(self)
- self._threshold = threshold
- self._level = level
- self._integration = integration
- self._sampling = sampling
- self._func = GetCompleteViewMapDensityF1D(self._level, self._integration, self._sampling) # sampling defaults to 2.0
- def __call__(self, inter):
- #print("toto")
- #print(func.name)
- #print(inter.name)
- v = self._func(inter)
- if v > self._threshold:
- return 1
- return 0
-
-class pyDensityFunctorUP1D(UnaryPredicate1D):
- def __init__(self,wsize,threshold, functor, funcmin=0.0, funcmax=1.0, integration = IntegrationType.MEAN):
- UnaryPredicate1D.__init__(self)
- self._wsize = wsize
- self._threshold = float(threshold)
- self._functor = functor
- self._funcmin = float(funcmin)
- self._funcmax = float(funcmax)
- self._integration = integration
- def __call__(self, inter):
- func = DensityF1D(self._wsize, self._integration)
- res = self._functor(inter)
- k = (res-self._funcmin)/(self._funcmax-self._funcmin)
- if func(inter) < self._threshold*k:
- return 1
- return 0
-
-class pyZSmallerUP1D(UnaryPredicate1D):
- def __init__(self,z, integration=IntegrationType.MEAN):
- UnaryPredicate1D.__init__(self)
- self._z = z
- self._integration = integration
- def __call__(self, inter):
- func = GetProjectedZF1D(self._integration)
- if func(inter) < self._z:
- return 1
- return 0
-
-class pyIsOccludedByUP1D(UnaryPredicate1D):
- def __init__(self,id):
- UnaryPredicate1D.__init__(self)
- self._id = id
- def __call__(self, inter):
- func = GetShapeF1D()
- shapes = func(inter)
- for s in shapes:
- if(s.id == self._id):
- return 0
- it = inter.vertices_begin()
- itlast = inter.vertices_end()
- itlast.decrement()
- v = it.object
- vlast = itlast.object
- tvertex = v.viewvertex
- if type(tvertex) is TVertex:
- #print("TVertex: [ ", tvertex.id.first, ",", tvertex.id.second," ]")
- eit = tvertex.edges_begin()
- while not eit.is_end:
- ve, incoming = eit.object
- if ve.id == self._id:
- return 1
- #print("-------", ve.id.first, "-", ve.id.second)
- eit.increment()
- tvertex = vlast.viewvertex
- if type(tvertex) is TVertex:
- #print("TVertex: [ ", tvertex.id.first, ",", tvertex.id.second," ]")
- eit = tvertex.edges_begin()
- while not eit.is_end:
- ve, incoming = eit.object
- if ve.id == self._id:
- return 1
- #print("-------", ve.id.first, "-", ve.id.second)
- eit.increment()
- return 0
-
-class pyIsInOccludersListUP1D(UnaryPredicate1D):
- def __init__(self,id):
- UnaryPredicate1D.__init__(self)
- self._id = id
- def __call__(self, inter):
- func = GetOccludersF1D()
- occluders = func(inter)
- for a in occluders:
- if a.id == self._id:
- return 1
- return 0
-
-class pyIsOccludedByItselfUP1D(UnaryPredicate1D):
- def __init__(self):
- UnaryPredicate1D.__init__(self)
- self.__func1 = GetOccludersF1D()
- self.__func2 = GetShapeF1D()
- def __call__(self, inter):
- lst1 = self.__func1(inter)
- lst2 = self.__func2(inter)
- for vs1 in lst1:
- for vs2 in lst2:
- if vs1.id == vs2.id:
- return 1
- return 0
-
-class pyIsOccludedByIdListUP1D(UnaryPredicate1D):
- def __init__(self, idlist):
- UnaryPredicate1D.__init__(self)
- self._idlist = idlist
- self.__func1 = GetOccludersF1D()
- def __call__(self, inter):
- lst1 = self.__func1(inter)
- for vs1 in lst1:
- for _id in self._idlist:
- if vs1.id == _id:
- return 1
- return 0
-
-class pyShapeIdListUP1D(UnaryPredicate1D):
- def __init__(self,idlist):
- UnaryPredicate1D.__init__(self)
- self._idlist = idlist
- self._funcs = []
- for _id in idlist :
- self._funcs.append(ShapeUP1D(_id.first, _id.second))
- def __call__(self, inter):
- for func in self._funcs :
- if func(inter) == 1:
- return 1
- return 0
-
-## deprecated
-class pyShapeIdUP1D(UnaryPredicate1D):
- def __init__(self, _id):
- UnaryPredicate1D.__init__(self)
- self._id = _id
- def __call__(self, inter):
- func = GetShapeF1D()
- shapes = func(inter)
- for a in shapes:
- if a.id == self._id:
- return 1
- return 0
-
-class pyHighDensityAnisotropyUP1D(UnaryPredicate1D):
- def __init__(self,threshold, level, sampling=2.0):
- UnaryPredicate1D.__init__(self)
- self._l = threshold
- self.func = pyDensityAnisotropyF1D(level, IntegrationType.MEAN, sampling)
- def __call__(self, inter):
- return (self.func(inter) > self._l)
-
-class pyHighViewMapGradientNormUP1D(UnaryPredicate1D):
- def __init__(self,threshold, l, sampling=2.0):
- UnaryPredicate1D.__init__(self)
- self._threshold = threshold
- self._GetGradient = pyViewMapGradientNormF1D(l, IntegrationType.MEAN)
- def __call__(self, inter):
- gn = self._GetGradient(inter)
- #print(gn)
- return (gn > self._threshold)
-
-class pyDensityVariableSigmaUP1D(UnaryPredicate1D):
- def __init__(self,functor, sigmaMin,sigmaMax, lmin, lmax, tmin, tmax, integration = IntegrationType.MEAN, sampling=2.0):
- UnaryPredicate1D.__init__(self)
- self._functor = functor
- self._sigmaMin = float(sigmaMin)
- self._sigmaMax = float(sigmaMax)
- self._lmin = float(lmin)
- self._lmax = float(lmax)
- self._tmin = tmin
- self._tmax = tmax
- self._integration = integration
- self._sampling = sampling
- def __call__(self, inter):
- sigma = (self._sigmaMax-self._sigmaMin)/(self._lmax-self._lmin)*(self._functor(inter)-self._lmin) + self._sigmaMin
- t = (self._tmax-self._tmin)/(self._lmax-self._lmin)*(self._functor(inter)-self._lmin) + self._tmin
- if sigma < self._sigmaMin:
- sigma = self._sigmaMin
- self._func = DensityF1D(sigma, self._integration, self._sampling)
- d = self._func(inter)
- if d < t:
- return 1
- return 0
-
-class pyClosedCurveUP1D(UnaryPredicate1D):
- def __call__(self, inter):
- it = inter.vertices_begin()
- itlast = inter.vertices_end()
- itlast.decrement()
- vlast = itlast.object
- v = it.object
- #print(v.id.first, v.id.second)
- #print(vlast.id.first, vlast.id.second)
- if v.id == vlast.id:
- return 1
- return 0
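`pyDensityVariableSigmaUP1D` above linearly remaps a functor value in [lmin, lmax] onto both a density window size in [sigmaMin, sigmaMax] and a threshold in [tmin, tmax]. The remapping itself is ordinary linear interpolation and can be checked without Freestyle (`remap` is a hypothetical helper name, not part of the API):

```python
# Standalone sketch of the linear remapping used by pyDensityVariableSigmaUP1D.
def remap(f, lmin, lmax, outmin, outmax):
    """Map f from [lmin, lmax] onto [outmin, outmax] linearly."""
    return (outmax - outmin) / (lmax - lmin) * (f - lmin) + outmin

# sigma is additionally clamped from below, as in the predicate above
sigma = max(remap(5.0, 0.0, 10.0, 1.0, 3.0), 1.0)
t = remap(5.0, 0.0, 10.0, 0.2, 0.8)
print(sigma)  # 2.0 (midpoint of [1.0, 3.0])
```

The predicate then builds a `DensityF1D(sigma, ...)` and compares its value against `t`.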
diff --git a/release/scripts/freestyle/style_modules/logical_operators.py b/release/scripts/freestyle/style_modules/logical_operators.py
deleted file mode 100644
index 0169ab7f2a7..00000000000
--- a/release/scripts/freestyle/style_modules/logical_operators.py
+++ /dev/null
@@ -1,47 +0,0 @@
-# ##### BEGIN GPL LICENSE BLOCK #####
-#
-# This program is free software; you can redistribute it and/or
-# modify it under the terms of the GNU General Public License
-# as published by the Free Software Foundation; either version 2
-# of the License, or (at your option) any later version.
-#
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with this program; if not, write to the Free Software Foundation,
-# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
-#
-# ##### END GPL LICENSE BLOCK #####
-
-# Filename : logical_operators.py
-# Authors : Fredo Durand, Stephane Grabli, Francois Sillion, Emmanuel Turquin
-# Date : 08/04/2005
-# Purpose : Logical unary predicates (functors) for 1D elements
-
-from freestyle import UnaryPredicate1D
-
-class AndUP1D(UnaryPredicate1D):
- def __init__(self, pred1, pred2):
- UnaryPredicate1D.__init__(self)
- self.__pred1 = pred1
- self.__pred2 = pred2
- def __call__(self, inter):
- return self.__pred1(inter) and self.__pred2(inter)
-
-class OrUP1D(UnaryPredicate1D):
- def __init__(self, pred1, pred2):
- UnaryPredicate1D.__init__(self)
- self.__pred1 = pred1
- self.__pred2 = pred2
- def __call__(self, inter):
- return self.__pred1(inter) or self.__pred2(inter)
-
-class NotUP1D(UnaryPredicate1D):
- def __init__(self, pred):
- UnaryPredicate1D.__init__(self)
- self.__pred = pred
- def __call__(self, inter):
- return not self.__pred(inter)
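`AndUP1D`, `OrUP1D`, and `NotUP1D` simply delegate to the wrapped predicates, so the same combinator pattern works for any boolean functor. A minimal sketch with plain Python callables (hypothetical stand-ins, not the real `UnaryPredicate1D` subclasses):

```python
# Standalone sketch of the predicate-combinator pattern from logical_operators.py.
class And:
    def __init__(self, p1, p2):
        self.p1, self.p2 = p1, p2
    def __call__(self, x):
        return self.p1(x) and self.p2(x)

class Not:
    def __init__(self, p):
        self.p = p
    def __call__(self, x):
        return not self.p(x)

long_enough = lambda length: length > 10
short_enough = lambda length: length < 100
in_range = And(long_enough, short_enough)
print(in_range(50), Not(in_range)(5))  # True True
```

Because the combinators are themselves callables, they nest arbitrarily, e.g. `And(Not(p1), p2)`.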
diff --git a/release/scripts/freestyle/style_modules/shaders.py b/release/scripts/freestyle/style_modules/shaders.py
deleted file mode 100644
index 01bfac4d074..00000000000
--- a/release/scripts/freestyle/style_modules/shaders.py
+++ /dev/null
@@ -1,1227 +0,0 @@
-# ##### BEGIN GPL LICENSE BLOCK #####
-#
-# This program is free software; you can redistribute it and/or
-# modify it under the terms of the GNU General Public License
-# as published by the Free Software Foundation; either version 2
-# of the License, or (at your option) any later version.
-#
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with this program; if not, write to the Free Software Foundation,
-# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
-#
-# ##### END GPL LICENSE BLOCK #####
-
-# Filename : shaders.py
-# Authors : Fredo Durand, Stephane Grabli, Francois Sillion, Emmanuel Turquin
-# Date : 11/08/2005
-# Purpose : Stroke shaders to be used for creation of stylized strokes
-
-from freestyle import AdjacencyIterator, Curvature2DAngleF0D, DensityF0D, GetProjectedZF0D, \
- Interface0DIterator, MaterialF0D, Nature, Noise, Normal2DF0D, Orientation2DF1D, \
- StrokeAttribute, StrokeShader, StrokeVertexIterator, ZDiscontinuityF0D
-from freestyle import ContextFunctions as CF
-from PredicatesU0D import pyVertexNatureUP0D
-
-import math
-import mathutils
-import random
-
-## thickness modifiers
-######################
-
-class pyDepthDiscontinuityThicknessShader(StrokeShader):
- def __init__(self, min, max):
- StrokeShader.__init__(self)
- self.__min = float(min)
- self.__max = float(max)
- self.__func = ZDiscontinuityF0D()
- def shade(self, stroke):
- z_min=0.0
- z_max=1.0
- a = (self.__max - self.__min)/(z_max-z_min)
- b = (self.__min*z_max-self.__max*z_min)/(z_max-z_min)
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- z = self.__func(Interface0DIterator(it))
- thickness = a*z+b
- it.object.attribute.thickness = (thickness, thickness)
- it.increment()
-
-class pyConstantThicknessShader(StrokeShader):
- def __init__(self, thickness):
- StrokeShader.__init__(self)
- self._thickness = thickness
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- t = self._thickness/2.0
- it.object.attribute.thickness = (t, t)
- it.increment()
-
-class pyFXSVaryingThicknessWithDensityShader(StrokeShader):
- def __init__(self, wsize, threshold_min, threshold_max, thicknessMin, thicknessMax):
- StrokeShader.__init__(self)
- self.wsize= wsize
- self.threshold_min= threshold_min
- self.threshold_max= threshold_max
- self._thicknessMin = thicknessMin
- self._thicknessMax = thicknessMax
- def shade(self, stroke):
- n = stroke.stroke_vertices_size()
- i = 0
- it = stroke.stroke_vertices_begin()
- func = DensityF0D(self.wsize)
- while not it.is_end:
- c = func(Interface0DIterator(it))
- if c < self.threshold_min:
- c = self.threshold_min
- if c > self.threshold_max:
- c = self.threshold_max
-## t = (c - self.threshold_min)/(self.threshold_max - self.threshold_min)*(self._thicknessMax-self._thicknessMin) + self._thicknessMin
- t = (self.threshold_max - c )/(self.threshold_max - self.threshold_min)*(self._thicknessMax-self._thicknessMin) + self._thicknessMin
- it.object.attribute.thickness = (t/2.0, t/2.0)
- i = i+1
- it.increment()
-
-class pyIncreasingThicknessShader(StrokeShader):
- def __init__(self, thicknessMin, thicknessMax):
- StrokeShader.__init__(self)
- self._thicknessMin = thicknessMin
- self._thicknessMax = thicknessMax
- def shade(self, stroke):
- n = stroke.stroke_vertices_size()
- i = 0
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- c = float(i)/float(n)
- if i < float(n)/2.0:
- t = (1.0 - c)*self._thicknessMin + c * self._thicknessMax
- else:
- t = (1.0 - c)*self._thicknessMax + c * self._thicknessMin
- it.object.attribute.thickness = (t/2.0, t/2.0)
- i = i+1
- it.increment()
-
-class pyConstrainedIncreasingThicknessShader(StrokeShader):
- def __init__(self, thicknessMin, thicknessMax, ratio):
- StrokeShader.__init__(self)
- self._thicknessMin = thicknessMin
- self._thicknessMax = thicknessMax
- self._ratio = ratio
- def shade(self, stroke):
- slength = stroke.length_2d
- tmp = self._ratio*slength
- maxT = 0.0
- if tmp < self._thicknessMax:
- maxT = tmp
- else:
- maxT = self._thicknessMax
- n = stroke.stroke_vertices_size()
- i = 0
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- att = it.object.attribute
- c = float(i)/float(n)
- if i < float(n)/2.0:
- t = (1.0 - c)*self._thicknessMin + c * maxT
- else:
- t = (1.0 - c)*maxT + c * self._thicknessMin
- att.thickness = (t/2.0, t/2.0)
- if i == n-1:
- att.thickness = (self._thicknessMin/2.0, self._thicknessMin/2.0)
- i = i+1
- it.increment()
-
-class pyDecreasingThicknessShader(StrokeShader):
- def __init__(self, thicknessMin, thicknessMax):
- StrokeShader.__init__(self)
- self._thicknessMin = thicknessMin
- self._thicknessMax = thicknessMax
- def shade(self, stroke):
- l = stroke.length_2d
- tMax = self._thicknessMax
- if self._thicknessMax > 0.33*l:
- tMax = 0.33*l
- tMin = self._thicknessMin
- if self._thicknessMin > 0.1*l:
- tMin = 0.1*l
- n = stroke.stroke_vertices_size()
- i = 0
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- c = float(i)/float(n)
- t = (1.0 - c)*tMax +c*tMin
- it.object.attribute.thickness = (t/2.0, t/2.0)
- i = i+1
- it.increment()
-
-class pyNonLinearVaryingThicknessShader(StrokeShader):
- def __init__(self, thicknessExtremity, thicknessMiddle, exponent):
- StrokeShader.__init__(self)
- self._thicknessMin = thicknessMiddle
- self._thicknessMax = thicknessExtremity
- self._exponent = exponent
- def shade(self, stroke):
- n = stroke.stroke_vertices_size()
- i = 0
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- if i < float(n)/2.0:
- c = float(i)/float(n)
- else:
- c = float(n-i)/float(n)
- c = self.smoothC(c, self._exponent)
- t = (1.0 - c)*self._thicknessMax + c * self._thicknessMin
- it.object.attribute.thickness = (t/2.0, t/2.0)
- i = i+1
- it.increment()
- def smoothC(self, a, exp):
- return math.pow(float(a), exp) * math.pow(2.0, exp)
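`smoothC` above is just `(2*a) ** exp` written as a product of powers: it maps the half-stroke parameter c in [0, 0.5] onto [0, 1] with an adjustable falloff. A quick standalone check (`smooth_c` is a hypothetical name for this sketch):

```python
import math

# Standalone sketch of the smoothC falloff from pyNonLinearVaryingThicknessShader:
# pow(a, e) * pow(2, e) == (2 * a) ** e
def smooth_c(a, exponent):
    return math.pow(a, exponent) * math.pow(2.0, exponent)

print(smooth_c(0.5, 3.0))   # 1.0 at the stroke middle, for any exponent
print(smooth_c(0.25, 2.0))  # 0.25
```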
-
-## Spherical linear interpolation (cos)
-class pySLERPThicknessShader(StrokeShader):
- def __init__(self, thicknessMin, thicknessMax, omega=1.2):
- StrokeShader.__init__(self)
- self._thicknessMin = thicknessMin
- self._thicknessMax = thicknessMax
- self._omega = omega
- def shade(self, stroke):
- slength = stroke.length_2d
- tmp = 0.33*slength
- maxT = self._thicknessMax
- if tmp < self._thicknessMax:
- maxT = tmp
- n = stroke.stroke_vertices_size()
- i = 0
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- c = float(i)/float(n)
- if i < float(n)/2.0:
- t = math.sin((1-c)*self._omega)/math.sinh(self._omega)*self._thicknessMin + math.sin(c*self._omega)/math.sinh(self._omega) * maxT
- else:
- t = math.sin((1-c)*self._omega)/math.sinh(self._omega)*maxT + math.sin(c*self._omega)/math.sinh(self._omega) * self._thicknessMin
- it.object.attribute.thickness = (t/2.0, t/2.0)
- i = i+1
- it.increment()
-
-class pyTVertexThickenerShader(StrokeShader): ## FIXME
- def __init__(self, a=1.5, n=3):
- StrokeShader.__init__(self)
- self._a = a
- self._n = n
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- predTVertex = pyVertexNatureUP0D(Nature.T_VERTEX)
- while not it.is_end:
- if predTVertex(it) == 1:
- it2 = StrokeVertexIterator(it)
- it2.increment()
- if not (it.is_begin or it2.is_end):
- it.increment()
- continue
- n = self._n
- a = self._a
- if it.is_begin:
- it3 = StrokeVertexIterator(it)
- count = 0
- while (not it3.is_end) and count < n:
- att = it3.object.attribute
- (tr, tl) = att.thickness
- r = (a-1.0)/float(n-1)*(float(n)/float(count+1) - 1) + 1
- #r = (1.0-a)/float(n-1)*count + a
- att.thickness = (r*tr, r*tl)
- it3.increment()
- count = count + 1
- if it2.is_end:
- it4 = StrokeVertexIterator(it)
- count = 0
- while (not it4.is_begin) and count < n:
- att = it4.object.attribute
- (tr, tl) = att.thickness
- r = (a-1.0)/float(n-1)*(float(n)/float(count+1) - 1) + 1
- #r = (1.0-a)/float(n-1)*count + a
- att.thickness = (r*tr, r*tl)
- it4.decrement()
- count = count + 1
- if it4.is_begin:
- att = it4.object.attribute
- (tr, tl) = att.thickness
- r = (a-1.0)/float(n-1)*(float(n)/float(count+1) - 1) + 1
- #r = (1.0-a)/float(n-1)*count + a
- att.thickness = (r*tr, r*tl)
- it.increment()
-
-class pyImportance2DThicknessShader(StrokeShader):
- def __init__(self, x, y, w, kmin, kmax):
- StrokeShader.__init__(self)
- self._x = x
- self._y = y
- self._w = float(w)
- self._kmin = float(kmin)
- self._kmax = float(kmax)
- def shade(self, stroke):
- origin = mathutils.Vector([self._x, self._y])
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- v = it.object
- p = mathutils.Vector([v.projected_x, v.projected_y])
- d = (p-origin).length
- if d > self._w:
- k = self._kmin
- else:
- k = (self._kmax*(self._w-d) + self._kmin*d)/self._w
- att = v.attribute
- (tr, tl) = att.thickness
- att.thickness = (k*tr/2.0, k*tl/2.0)
- it.increment()
-
-class pyImportance3DThicknessShader(StrokeShader):
- def __init__(self, x, y, z, w, kmin, kmax):
- StrokeShader.__init__(self)
- self._x = x
- self._y = y
- self._z = z
- self._w = float(w)
- self._kmin = float(kmin)
- self._kmax = float(kmax)
- def shade(self, stroke):
- origin = mathutils.Vector([self._x, self._y, self._z])
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- v = it.object
- p = v.point_3d
- d = (p-origin).length
- if d > self._w:
- k = self._kmin
- else:
- k = (self._kmax*(self._w-d) + self._kmin*d)/self._w
- att = v.attribute
- (tr, tl) = att.thickness
- att.thickness = (k*tr/2.0, k*tl/2.0)
- it.increment()
-
-class pyZDependingThicknessShader(StrokeShader):
- def __init__(self, min, max):
- StrokeShader.__init__(self)
- self.__min = min
- self.__max = max
- self.__func = GetProjectedZF0D()
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- z_min = 1
- z_max = 0
- while not it.is_end:
- z = self.__func(Interface0DIterator(it))
- if z < z_min:
- z_min = z
- if z > z_max:
- z_max = z
- it.increment()
- z_diff = 1 / (z_max - z_min)
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- z = (self.__func(Interface0DIterator(it)) - z_min) * z_diff
- thickness = (1 - z) * self.__max + z * self.__min
- it.object.attribute.thickness = (thickness, thickness)
- it.increment()
-
-
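The thickness shaders above all reduce to interpolating a per-vertex parameter c = i/n between two thickness bounds, then storing half-thicknesses per side. A standalone sketch of the ramp used by `pyIncreasingThicknessShader` (`ramp_thickness` is a hypothetical helper, not the Freestyle API):

```python
# Standalone sketch of the thickness ramp from pyIncreasingThicknessShader:
# thickness grows from tmin to tmax over the first half of the stroke,
# then shrinks back to tmin over the second half.
def ramp_thickness(i, n, tmin, tmax):
    c = i / n
    if i < n / 2.0:
        t = (1.0 - c) * tmin + c * tmax
    else:
        t = (1.0 - c) * tmax + c * tmin
    return (t / 2.0, t / 2.0)  # half-thickness on each side of the backbone

print(ramp_thickness(0, 10, 2.0, 6.0))   # (1.0, 1.0): full thickness 2.0
print(ramp_thickness(5, 10, 2.0, 6.0))   # (2.0, 2.0): thickest at the middle
print(ramp_thickness(10, 10, 2.0, 6.0))  # (1.0, 1.0): back to 2.0 at the end
```

The constrained and SLERP variants above only change how the middle value is computed (clamping by stroke length, or sine-based blending).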
-## color modifiers
-##################
-
-class pyConstantColorShader(StrokeShader):
- def __init__(self,r,g,b, a = 1):
- StrokeShader.__init__(self)
- self._r = r
- self._g = g
- self._b = b
- self._a = a
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- att = it.object.attribute
- att.color = (self._r, self._g, self._b)
- att.alpha = self._a
- it.increment()
-
-#c1->c2
-class pyIncreasingColorShader(StrokeShader):
- def __init__(self,r1,g1,b1,a1, r2,g2,b2,a2):
- StrokeShader.__init__(self)
- self._c1 = [r1,g1,b1,a1]
- self._c2 = [r2,g2,b2,a2]
- def shade(self, stroke):
- n = stroke.stroke_vertices_size() - 1
- inc = 0
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- att = it.object.attribute
- c = float(inc)/float(n)
-
- att.color = ((1-c)*self._c1[0] + c*self._c2[0],
- (1-c)*self._c1[1] + c*self._c2[1],
- (1-c)*self._c1[2] + c*self._c2[2])
- att.alpha = (1-c)*self._c1[3] + c*self._c2[3]
- inc = inc+1
- it.increment()
-
-# c1->c2->c1
-class pyInterpolateColorShader(StrokeShader):
- def __init__(self,r1,g1,b1,a1, r2,g2,b2,a2):
- StrokeShader.__init__(self)
- self._c1 = [r1,g1,b1,a1]
- self._c2 = [r2,g2,b2,a2]
- def shade(self, stroke):
- n = stroke.stroke_vertices_size() - 1
- inc = 0
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- att = it.object.attribute
- u = float(inc)/float(n)
- c = 1-2*(math.fabs(u-0.5))
- att.color = ((1-c)*self._c1[0] + c*self._c2[0],
- (1-c)*self._c1[1] + c*self._c2[1],
- (1-c)*self._c1[2] + c*self._c2[2])
- att.alpha = (1-c)*self._c1[3] + c*self._c2[3]
- inc = inc+1
- it.increment()
-
-class pyMaterialColorShader(StrokeShader):
- def __init__(self, threshold=50):
- StrokeShader.__init__(self)
- self._threshold = threshold
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- func = MaterialF0D()
- xn = 0.312713
- yn = 0.329016
- Yn = 1.0
- un = 4.* xn/ ( -2.*xn + 12.*yn + 3. )
- vn= 9.* yn/ ( -2.*xn + 12.*yn +3. )
- while not it.is_end:
- mat = func(Interface0DIterator(it))
-
- r = mat.diffuse[0]
- g = mat.diffuse[1]
- b = mat.diffuse[2]
-
- X = 0.412453*r + 0.35758 *g + 0.180423*b
- Y = 0.212671*r + 0.71516 *g + 0.072169*b
- Z = 0.019334*r + 0.119193*g + 0.950227*b
-
- if X == 0 and Y == 0 and Z == 0:
- X = 0.01
- Y = 0.01
- Z = 0.01
- u = 4.*X / (X + 15.*Y + 3.*Z)
- v = 9.*Y / (X + 15.*Y + 3.*Z)
-
- L= 116. * math.pow((Y/Yn),(1./3.)) -16
- U = 13. * L * (u - un)
- V = 13. * L * (v - vn)
-
- if L > self._threshold:
- L = L/1.3
- U = U+10
- else:
- L = L +2.5*(100-L)/5.
- U = U/3.0
- V = V/3.0
- u = U / (13. * L) + un
- v = V / (13. * L) + vn
-
- Y = Yn * math.pow( ((L+16.)/116.), 3.)
- X = -9. * Y * u / ((u - 4.)* v - u * v)
- Z = (9. * Y - 15*v*Y - v*X) /( 3. * v)
-
- r = 3.240479 * X - 1.53715 * Y - 0.498535 * Z
- g = -0.969256 * X + 1.875991 * Y + 0.041556 * Z
- b = 0.055648 * X - 0.204043 * Y + 1.057311 * Z
-
- r = max(0,r)
- g = max(0,g)
- b = max(0,b)
-
- it.object.attribute.color = (r, g, b)
- it.increment()
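`pyMaterialColorShader` converts the diffuse RGB to CIE XYZ, then to CIE L\*u\*v\*, adjusts the lightness there, and converts back. The forward conversion can be verified standalone with the same matrix and white point as above (`rgb_to_luv` is a hypothetical helper name for this sketch):

```python
import math

# Standalone sketch of the forward RGB -> XYZ -> CIE L*u*v* conversion used
# by pyMaterialColorShader, with the same coefficients and white point.
def rgb_to_luv(r, g, b):
    X = 0.412453 * r + 0.35758 * g + 0.180423 * b
    Y = 0.212671 * r + 0.71516 * g + 0.072169 * b
    Z = 0.019334 * r + 0.119193 * g + 0.950227 * b
    xn, yn, Yn = 0.312713, 0.329016, 1.0
    un = 4.0 * xn / (-2.0 * xn + 12.0 * yn + 3.0)
    vn = 9.0 * yn / (-2.0 * xn + 12.0 * yn + 3.0)
    u = 4.0 * X / (X + 15.0 * Y + 3.0 * Z)
    v = 9.0 * Y / (X + 15.0 * Y + 3.0 * Z)
    L = 116.0 * math.pow(Y / Yn, 1.0 / 3.0) - 16.0
    return L, 13.0 * L * (u - un), 13.0 * L * (v - vn)

L, U, V = rgb_to_luv(1.0, 1.0, 1.0)
print(round(L, 3))  # white has L* = 100 and chromaticity (U, V) near zero
```

Working in L\*u\*v\* lets the shader brighten or darken a color while keeping its hue roughly constant, which is what the threshold branch above does.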
-
-class pyRandomColorShader(StrokeShader):
- def __init__(self, s=1):
- StrokeShader.__init__(self)
- random.seed(s)
- def shade(self, stroke):
- ## pick a random color
- c0 = float(random.uniform(15,75))/100.0
- c1 = float(random.uniform(15,75))/100.0
- c2 = float(random.uniform(15,75))/100.0
- #print(c0, c1, c2)
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- it.object.attribute.color = (c0,c1,c2)
- it.increment()
-
-class py2DCurvatureColorShader(StrokeShader):
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- func = Curvature2DAngleF0D()
- while not it.is_end:
- c = func(Interface0DIterator(it))
- if c < 0:
- print("negative 2D curvature")
- color = 10.0 * c/3.1415
- it.object.attribute.color = (color, color, color)
- it.increment()
-
-class pyTimeColorShader(StrokeShader):
- def __init__(self, step=0.01):
- StrokeShader.__init__(self)
- self._t = 0
- self._step = step
- def shade(self, stroke):
- c = self._t*1.0
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- it.object.attribute.color = (c,c,c)
- it.increment()
- self._t = self._t+self._step
-
-## geometry modifiers
-
-class pySamplingShader(StrokeShader):
- def __init__(self, sampling):
- StrokeShader.__init__(self)
- self._sampling = sampling
- def shade(self, stroke):
- stroke.resample(float(self._sampling))
- stroke.update_length()
-
-class pyBackboneStretcherShader(StrokeShader):
- def __init__(self, l):
- StrokeShader.__init__(self)
- self._l = l
- def shade(self, stroke):
- it0 = stroke.stroke_vertices_begin()
- it1 = StrokeVertexIterator(it0)
- it1.increment()
- itn = stroke.stroke_vertices_end()
- itn.decrement()
- itn_1 = StrokeVertexIterator(itn)
- itn_1.decrement()
- v0 = it0.object
- v1 = it1.object
- vn_1 = itn_1.object
- vn = itn.object
- p0 = mathutils.Vector([v0.projected_x, v0.projected_y])
- pn = mathutils.Vector([vn.projected_x, vn.projected_y])
- p1 = mathutils.Vector([v1.projected_x, v1.projected_y])
- pn_1 = mathutils.Vector([vn_1.projected_x, vn_1.projected_y])
- d1 = p0-p1
- d1.normalize()
- dn = pn-pn_1
- dn.normalize()
- newFirst = p0+d1*float(self._l)
- newLast = pn+dn*float(self._l)
- v0.point = newFirst
- vn.point = newLast
- stroke.update_length()
-
-class pyLengthDependingBackboneStretcherShader(StrokeShader):
- def __init__(self, l):
- StrokeShader.__init__(self)
- self._l = l
- def shade(self, stroke):
- l = stroke.length_2d
- stretch = self._l*l
- it0 = stroke.stroke_vertices_begin()
- it1 = StrokeVertexIterator(it0)
- it1.increment()
- itn = stroke.stroke_vertices_end()
- itn.decrement()
- itn_1 = StrokeVertexIterator(itn)
- itn_1.decrement()
- v0 = it0.object
- v1 = it1.object
- vn_1 = itn_1.object
- vn = itn.object
- p0 = mathutils.Vector([v0.projected_x, v0.projected_y])
- pn = mathutils.Vector([vn.projected_x, vn.projected_y])
- p1 = mathutils.Vector([v1.projected_x, v1.projected_y])
- pn_1 = mathutils.Vector([vn_1.projected_x, vn_1.projected_y])
- d1 = p0-p1
- d1.normalize()
- dn = pn-pn_1
- dn.normalize()
- newFirst = p0+d1*float(stretch)
- newLast = pn+dn*float(stretch)
- v0.point = newFirst
- vn.point = newLast
- stroke.update_length()
-
-
-## Shader to replace a stroke by its corresponding tangent
-class pyGuidingLineShader(StrokeShader):
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin() ## get the first vertex
- itlast = stroke.stroke_vertices_end() ##
- itlast.decrement() ## get the last one
- t = itlast.object.point - it.object.point ## tangent direction
- itmiddle = StrokeVertexIterator(it) ##
- while itmiddle.object.u < 0.5: ## look for the stroke middle vertex
- itmiddle.increment() ##
- it = StrokeVertexIterator(itmiddle)
- it.increment()
- while not it.is_end: ## position all the vertices along the tangent for the right part
- it.object.point = itmiddle.object.point \
- +t*(it.object.u-itmiddle.object.u)
- it.increment()
- it = StrokeVertexIterator(itmiddle)
- it.decrement()
- while not it.is_begin: ## position all the vertices along the tangent for the left part
- it.object.point = itmiddle.object.point \
- -t*(itmiddle.object.u-it.object.u)
- it.decrement()
- it.object.point = itmiddle.object.point-t*itmiddle.object.u ## first vertex
- stroke.update_length()
-
-
-class pyBackboneStretcherNoCuspShader(StrokeShader):
- def __init__(self, l):
- StrokeShader.__init__(self)
- self._l = l
- def shade(self, stroke):
- it0 = stroke.stroke_vertices_begin()
- it1 = StrokeVertexIterator(it0)
- it1.increment()
- itn = stroke.stroke_vertices_end()
- itn.decrement()
- itn_1 = StrokeVertexIterator(itn)
- itn_1.decrement()
- v0 = it0.object
- v1 = it1.object
- if (v0.nature & Nature.CUSP) == 0 and (v1.nature & Nature.CUSP) == 0:
- p0 = v0.point
- p1 = v1.point
- d1 = p0-p1
- d1.normalize()
- newFirst = p0+d1*float(self._l)
- v0.point = newFirst
- vn_1 = itn_1.object
- vn = itn.object
- if (vn.nature & Nature.CUSP) == 0 and (vn_1.nature & Nature.CUSP) == 0:
- pn = vn.point
- pn_1 = vn_1.point
- dn = pn-pn_1
- dn.normalize()
- newLast = pn+dn*float(self._l)
- vn.point = newLast
- stroke.update_length()
-
-class pyDiffusion2Shader(StrokeShader):
- """This shader iteratively adds an offset to the position of each
- stroke vertex in the direction perpendicular to the stroke direction
- at the point. The offset is scaled by the 2D curvature (i.e., how
- quickly the stroke curve is) at the point."""
- def __init__(self, lambda1, nbIter):
- StrokeShader.__init__(self)
- self._lambda = lambda1
- self._nbIter = nbIter
- self._normalInfo = Normal2DF0D()
- self._curvatureInfo = Curvature2DAngleF0D()
- def shade(self, stroke):
- for i in range (1, self._nbIter):
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- v = it.object
- p1 = v.point
- p2 = self._normalInfo(Interface0DIterator(it))*self._lambda*self._curvatureInfo(Interface0DIterator(it))
- v.point = p1+p2
- it.increment()
- stroke.update_length()
-
-class pyTipRemoverShader(StrokeShader):
- def __init__(self, l):
- StrokeShader.__init__(self)
- self._l = l
- def shade(self, stroke):
- originalSize = stroke.stroke_vertices_size()
- if originalSize < 4:
- return
- verticesToRemove = []
- oldAttributes = []
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- v = it.object
- if v.curvilinear_abscissa < self._l or v.stroke_length-v.curvilinear_abscissa < self._l:
- verticesToRemove.append(v)
- oldAttributes.append(StrokeAttribute(v.attribute))
- it.increment()
- if originalSize-len(verticesToRemove) < 2:
- return
- for sv in verticesToRemove:
- stroke.remove_vertex(sv)
- stroke.update_length()
- stroke.resample(originalSize)
- if stroke.stroke_vertices_size() != originalSize:
- print("pyTipRemover: Warning: resampling problem")
- it = stroke.stroke_vertices_begin()
- for a in oldAttributes:
- if it.is_end:
- break
- it.object.attribute = a
- it.increment()
- stroke.update_length()
-
-class pyTVertexRemoverShader(StrokeShader):
- def shade(self, stroke):
- if stroke.stroke_vertices_size() <= 3:
- return
- predTVertex = pyVertexNatureUP0D(Nature.T_VERTEX)
- it = stroke.stroke_vertices_begin()
- itlast = stroke.stroke_vertices_end()
- itlast.decrement()
- if predTVertex(it):
- stroke.remove_vertex(it.object)
- if predTVertex(itlast):
- stroke.remove_vertex(itlast.object)
- stroke.update_length()
-
-#class pyExtremitiesOrientationShader(StrokeShader):
-# def __init__(self, x1,y1,x2=0,y2=0):
-# StrokeShader.__init__(self)
-# self._v1 = mathutils.Vector([x1,y1])
-# self._v2 = mathutils.Vector([x2,y2])
-# def shade(self, stroke):
-# #print(self._v1.x,self._v1.y)
-# stroke.setBeginningOrientation(self._v1.x,self._v1.y)
-# stroke.setEndingOrientation(self._v2.x,self._v2.y)
-
-class pyHLRShader(StrokeShader):
- def shade(self, stroke):
- originalSize = stroke.stroke_vertices_size()
- if originalSize < 4:
- return
- it = stroke.stroke_vertices_begin()
- invisible = 0
- it2 = StrokeVertexIterator(it)
- it2.increment()
- fe = self.get_fedge(it.object, it2.object)
- if fe.viewedge.qi != 0:
- invisible = 1
- while not it2.is_end:
- v = it.object
- vnext = it2.object
- if (v.nature & Nature.VIEW_VERTEX) != 0:
- #if (v.nature & Nature.T_VERTEX) != 0:
- fe = self.get_fedge(v, vnext)
- qi = fe.viewedge.qi
- if qi != 0:
- invisible = 1
- else:
- invisible = 0
- if invisible:
- v.attribute.visible = False
- it.increment()
- it2.increment()
- def get_fedge(self, it1, it2):
- return it1.get_fedge(it2)
-
-class pyTVertexOrientationShader(StrokeShader):
- def __init__(self):
- StrokeShader.__init__(self)
- self._Get2dDirection = Orientation2DF1D()
- ## finds the TVertex orientation from the TVertex and
- ## the previous or next edge
- def findOrientation(self, tv, ve):
- mateVE = tv.get_mate(ve)
- if ve.qi != 0 or mateVE.qi != 0:
- ait = AdjacencyIterator(tv,1,0)
- winner = None
- incoming = True
- while not ait.is_end:
- ave = ait.object
- if ave.id != ve.id and ave.id != mateVE.id:
- winner = ait.object
- if not ait.isIncoming(): # FIXME
- incoming = False
- break
- ait.increment()
- if winner is not None:
- if not incoming:
- direction = self._Get2dDirection(winner.last_fedge)
- else:
- direction = self._Get2dDirection(winner.first_fedge)
- return direction
- return None
- def castToTVertex(self, cp):
- if cp.t2d() == 0.0:
- return cp.first_svertex.viewvertex
- elif cp.t2d() == 1.0:
- return cp.second_svertex.viewvertex
- return None
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- it2 = StrokeVertexIterator(it)
- it2.increment()
- ## case where the first vertex is a TVertex
- v = it.object
- if (v.nature & Nature.T_VERTEX) != 0:
- tv = self.castToTVertex(v)
- if tv is not None:
- ve = self.get_fedge(v, it2.object).viewedge
- dir = self.findOrientation(tv, ve)
- if dir is not None:
- #print(dir.x, dir.y)
- v.attribute.set_attribute_vec2("orientation", dir)
- while not it2.is_end:
- vprevious = it.object
- v = it2.object
- if (v.nature & Nature.T_VERTEX) != 0:
- tv = self.castToTVertex(v)
- if tv is not None:
- ve = self.get_fedge(vprevious, v).viewedge
- dir = self.findOrientation(tv, ve)
- if dir is not None:
- #print(dir.x, dir.y)
- v.attribute.set_attribute_vec2("orientation", dir)
- it.increment()
- it2.increment()
- ## case where the last vertex is a TVertex
- v = it.object
- if (v.nature & Nature.T_VERTEX) != 0:
- itPrevious = StrokeVertexIterator(it)
- itPrevious.decrement()
- tv = self.castToTVertex(v)
- if tv is not None:
- ve = self.get_fedge(itPrevious.object, v).viewedge
- dir = self.findOrientation(tv, ve)
- if dir is not None:
- #print(dir.x, dir.y)
- v.attribute.set_attribute_vec2("orientation", dir)
- def get_fedge(self, it1, it2):
- return it1.get_fedge(it2)
-
-class pySinusDisplacementShader(StrokeShader):
- def __init__(self, f, a):
- StrokeShader.__init__(self)
- self._f = f
- self._a = a
- self._getNormal = Normal2DF0D()
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- v = it.object
- #print(self._getNormal.name)
- n = self._getNormal(Interface0DIterator(it))
- p = v.point
- u = v.u
- a = self._a*(1-2*(math.fabs(u-0.5)))
- n = n*a*math.cos(self._f*u*6.28)
- #print(n.x, n.y)
- v.point = p+n
- #v.point = v.point+n*a*math.cos(f*v.u)
- it.increment()
- stroke.update_length()
-
-class pyPerlinNoise1DShader(StrokeShader):
- def __init__(self, freq = 10, amp = 10, oct = 4, seed = -1):
- StrokeShader.__init__(self)
- self.__noise = Noise(seed)
- self.__freq = freq
- self.__amp = amp
- self.__oct = oct
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- v = it.object
- i = v.projected_x + v.projected_y
- nres = self.__noise.turbulence1(i, self.__freq, self.__amp, self.__oct)
- v.point = (v.projected_x + nres, v.projected_y + nres)
- it.increment()
- stroke.update_length()
-
-class pyPerlinNoise2DShader(StrokeShader):
- def __init__(self, freq = 10, amp = 10, oct = 4, seed = -1):
- StrokeShader.__init__(self)
- self.__noise = Noise(seed)
- self.__freq = freq
- self.__amp = amp
- self.__oct = oct
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- v = it.object
- vec = mathutils.Vector([v.projected_x, v.projected_y])
- nres = self.__noise.turbulence2(vec, self.__freq, self.__amp, self.__oct)
- v.point = (v.projected_x + nres, v.projected_y + nres)
- it.increment()
- stroke.update_length()
-
-class pyBluePrintCirclesShader(StrokeShader):
- def __init__(self, turns = 1, random_radius = 3, random_center = 5):
- StrokeShader.__init__(self)
- self.__turns = turns
- self.__random_center = random_center
- self.__random_radius = random_radius
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- if it.is_end:
- return
- p_min = it.object.point.copy()
- p_max = it.object.point.copy()
- while not it.is_end:
- p = it.object.point
- if p.x < p_min.x:
- p_min.x = p.x
- if p.x > p_max.x:
- p_max.x = p.x
- if p.y < p_min.y:
- p_min.y = p.y
- if p.y > p_max.y:
- p_max.y = p.y
- it.increment()
- stroke.resample(32 * self.__turns)
- sv_nb = stroke.stroke_vertices_size()
-# print("min :", p_min.x, p_min.y) # DEBUG
-# print("mean :", p_sum.x, p_sum.y) # DEBUG
-# print("max :", p_max.x, p_max.y) # DEBUG
-# print("----------------------") # DEBUG
-#######################################################
- sv_nb = sv_nb // self.__turns
- center = (p_min + p_max) / 2
- radius = (center.x - p_min.x + center.y - p_min.y) / 2
- p_new = mathutils.Vector([0, 0])
-#######################################################
- R = self.__random_radius
- C = self.__random_center
- i = 0
- it = stroke.stroke_vertices_begin()
- for j in range(self.__turns):
- prev_radius = radius
- prev_center = center
- radius = radius + random.randint(-R, R)
- center = center + mathutils.Vector([random.randint(-C, C), random.randint(-C, C)])
- while i < sv_nb and not it.is_end:
- t = float(i) / float(sv_nb - 1)
- r = prev_radius + (radius - prev_radius) * t
- c = prev_center + (center - prev_center) * t
- p_new.x = c.x + r * math.cos(2 * math.pi * t)
- p_new.y = c.y + r * math.sin(2 * math.pi * t)
- it.object.point = p_new
- i = i + 1
- it.increment()
- i = 1
- verticesToRemove = []
- while not it.is_end:
- verticesToRemove.append(it.object)
- it.increment()
- for sv in verticesToRemove:
- stroke.remove_vertex(sv)
- stroke.update_length()
-
-class pyBluePrintEllipsesShader(StrokeShader):
- def __init__(self, turns = 1, random_radius = 3, random_center = 5):
- StrokeShader.__init__(self)
- self.__turns = turns
- self.__random_center = random_center
- self.__random_radius = random_radius
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- if it.is_end:
- return
- p_min = it.object.point.copy()
- p_max = it.object.point.copy()
- while not it.is_end:
- p = it.object.point
- if p.x < p_min.x:
- p_min.x = p.x
- if p.x > p_max.x:
- p_max.x = p.x
- if p.y < p_min.y:
- p_min.y = p.y
- if p.y > p_max.y:
- p_max.y = p.y
- it.increment()
- stroke.resample(32 * self.__turns)
- sv_nb = stroke.stroke_vertices_size()
- sv_nb = sv_nb // self.__turns
- center = (p_min + p_max) / 2
- radius = center - p_min
- p_new = mathutils.Vector([0, 0])
-#######################################################
- R = self.__random_radius
- C = self.__random_center
- i = 0
- it = stroke.stroke_vertices_begin()
- for j in range(self.__turns):
- prev_radius = radius
- prev_center = center
- radius = radius + mathutils.Vector([random.randint(-R, R), random.randint(-R, R)])
- center = center + mathutils.Vector([random.randint(-C, C), random.randint(-C, C)])
- while i < sv_nb and not it.is_end:
- t = float(i) / float(sv_nb - 1)
- r = prev_radius + (radius - prev_radius) * t
- c = prev_center + (center - prev_center) * t
- p_new.x = c.x + r.x * math.cos(2 * math.pi * t)
- p_new.y = c.y + r.y * math.sin(2 * math.pi * t)
- it.object.point = p_new
- i = i + 1
- it.increment()
- i = 1
- verticesToRemove = []
- while not it.is_end:
- verticesToRemove.append(it.object)
- it.increment()
- for sv in verticesToRemove:
- stroke.remove_vertex(sv)
- stroke.update_length()
-
-
-class pyBluePrintSquaresShader(StrokeShader):
- def __init__(self, turns = 1, bb_len = 10, bb_rand = 0):
- StrokeShader.__init__(self)
- self.__turns = turns
- self.__bb_len = bb_len
- self.__bb_rand = bb_rand
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- if it.is_end:
- return
- p_min = it.object.point.copy()
- p_max = it.object.point.copy()
- while not it.is_end:
- p = it.object.point
- if p.x < p_min.x:
- p_min.x = p.x
- if p.x > p_max.x:
- p_max.x = p.x
- if p.y < p_min.y:
- p_min.y = p.y
- if p.y > p_max.y:
- p_max.y = p.y
- it.increment()
- stroke.resample(32 * self.__turns)
- sv_nb = stroke.stroke_vertices_size()
-#######################################################
- sv_nb = sv_nb // self.__turns
- first = sv_nb // 4
- second = 2 * first
- third = 3 * first
- fourth = sv_nb
- p_first = mathutils.Vector([p_min.x - self.__bb_len, p_min.y])
- p_first_end = mathutils.Vector([p_max.x + self.__bb_len, p_min.y])
- p_second = mathutils.Vector([p_max.x, p_min.y - self.__bb_len])
- p_second_end = mathutils.Vector([p_max.x, p_max.y + self.__bb_len])
- p_third = mathutils.Vector([p_max.x + self.__bb_len, p_max.y])
- p_third_end = mathutils.Vector([p_min.x - self.__bb_len, p_max.y])
- p_fourth = mathutils.Vector([p_min.x, p_max.y + self.__bb_len])
- p_fourth_end = mathutils.Vector([p_min.x, p_min.y - self.__bb_len])
-#######################################################
- R = self.__bb_rand
- r = self.__bb_rand // 2
- it = stroke.stroke_vertices_begin()
- visible = True
- for j in range(self.__turns):
- p_first = p_first + mathutils.Vector([random.randint(-R, R), random.randint(-r, r)])
- p_first_end = p_first_end + mathutils.Vector([random.randint(-R, R), random.randint(-r, r)])
- p_second = p_second + mathutils.Vector([random.randint(-r, r), random.randint(-R, R)])
- p_second_end = p_second_end + mathutils.Vector([random.randint(-r, r), random.randint(-R, R)])
- p_third = p_third + mathutils.Vector([random.randint(-R, R), random.randint(-r, r)])
- p_third_end = p_third_end + mathutils.Vector([random.randint(-R, R), random.randint(-r, r)])
- p_fourth = p_fourth + mathutils.Vector([random.randint(-r, r), random.randint(-R, R)])
- p_fourth_end = p_fourth_end + mathutils.Vector([random.randint(-r, r), random.randint(-R, R)])
- vec_first = p_first_end - p_first
- vec_second = p_second_end - p_second
- vec_third = p_third_end - p_third
- vec_fourth = p_fourth_end - p_fourth
- i = 0
- while i < sv_nb and not it.is_end:
- if i < first:
- p_new = p_first + vec_first * float(i)/float(first - 1)
- if i == first - 1:
- visible = False
- elif i < second:
- p_new = p_second + vec_second * float(i - first)/float(second - first - 1)
- if i == second - 1:
- visible = False
- elif i < third:
- p_new = p_third + vec_third * float(i - second)/float(third - second - 1)
- if i == third - 1:
- visible = False
- else:
- p_new = p_fourth + vec_fourth * float(i - third)/float(fourth - third - 1)
- if i == fourth - 1:
- visible = False
- if it.object == None:
- i = i + 1
- it.increment()
- if not visible:
- visible = True
- continue
- it.object.point = p_new
- it.object.attribute.visible = visible
- if not visible:
- visible = True
- i = i + 1
- it.increment()
- verticesToRemove = []
- while not it.is_end:
- verticesToRemove.append(it.object)
- it.increment()
- for sv in verticesToRemove:
- stroke.remove_vertex(sv)
- stroke.update_length()
-
-
-class pyBluePrintDirectedSquaresShader(StrokeShader):
- def __init__(self, turns = 1, bb_len = 10, mult = 1):
- StrokeShader.__init__(self)
- self.__mult = mult
- self.__turns = turns
- self.__bb_len = 1 + float(bb_len) / 100
- def shade(self, stroke):
- stroke.resample(32 * self.__turns)
- p_mean = mathutils.Vector([0, 0])
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- p = it.object.point
- p_mean = p_mean + p
- it.increment()
- sv_nb = stroke.stroke_vertices_size()
- p_mean = p_mean / sv_nb
- p_var_xx = 0
- p_var_yy = 0
- p_var_xy = 0
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- p = it.object.point
- p_var_xx = p_var_xx + math.pow(p.x - p_mean.x, 2)
- p_var_yy = p_var_yy + math.pow(p.y - p_mean.y, 2)
- p_var_xy = p_var_xy + (p.x - p_mean.x) * (p.y - p_mean.y)
- it.increment()
- p_var_xx = p_var_xx / sv_nb
- p_var_yy = p_var_yy / sv_nb
- p_var_xy = p_var_xy / sv_nb
-## print(p_var_xx, p_var_yy, p_var_xy)
- trace = p_var_xx + p_var_yy
- det = p_var_xx * p_var_yy - p_var_xy * p_var_xy
- sqrt_coeff = math.sqrt(trace * trace - 4 * det)
- lambda1 = (trace + sqrt_coeff) / 2
- lambda2 = (trace - sqrt_coeff) / 2
-## print(lambda1, lambda2)
- theta = math.atan(2 * p_var_xy / (p_var_xx - p_var_yy)) / 2
-## print(theta)
- if p_var_yy > p_var_xx:
- e1 = mathutils.Vector([math.cos(theta + math.pi / 2), math.sin(theta + math.pi / 2)]) * math.sqrt(lambda1) * self.__mult
- e2 = mathutils.Vector([math.cos(theta + math.pi), math.sin(theta + math.pi)]) * math.sqrt(lambda2) * self.__mult
- else:
- e1 = mathutils.Vector([math.cos(theta), math.sin(theta)]) * math.sqrt(lambda1) * self.__mult
- e2 = mathutils.Vector([math.cos(theta + math.pi / 2), math.sin(theta + math.pi / 2)]) * math.sqrt(lambda2) * self.__mult
-#######################################################
- sv_nb = sv_nb // self.__turns
- first = sv_nb // 4
- second = 2 * first
- third = 3 * first
- fourth = sv_nb
- bb_len1 = self.__bb_len
- bb_len2 = 1 + (bb_len1 - 1) * math.sqrt(lambda1 / lambda2)
- p_first = p_mean - e1 - e2 * bb_len2
- p_second = p_mean - e1 * bb_len1 + e2
- p_third = p_mean + e1 + e2 * bb_len2
- p_fourth = p_mean + e1 * bb_len1 - e2
- vec_first = e2 * bb_len2 * 2
- vec_second = e1 * bb_len1 * 2
- vec_third = vec_first * -1
- vec_fourth = vec_second * -1
-#######################################################
- it = stroke.stroke_vertices_begin()
- visible = True
- for j in range(self.__turns):
- i = 0
- while i < sv_nb:
- if i < first:
- p_new = p_first + vec_first * float(i)/float(first - 1)
- if i == first - 1:
- visible = False
- elif i < second:
- p_new = p_second + vec_second * float(i - first)/float(second - first - 1)
- if i == second - 1:
- visible = False
- elif i < third:
- p_new = p_third + vec_third * float(i - second)/float(third - second - 1)
- if i == third - 1:
- visible = False
- else:
- p_new = p_fourth + vec_fourth * float(i - third)/float(fourth - third - 1)
- if i == fourth - 1:
- visible = False
- it.object.point = p_new
- it.object.attribute.visible = visible
- if not visible:
- visible = True
- i = i + 1
- it.increment()
- verticesToRemove = []
- while not it.is_end:
- verticesToRemove.append(it.object)
- it.increment()
- for sv in verticesToRemove:
- stroke.remove_vertex(sv)
- stroke.update_length()
-
-class pyModulateAlphaShader(StrokeShader):
- def __init__(self, min = 0, max = 1):
- StrokeShader.__init__(self)
- self.__min = min
- self.__max = max
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- alpha = it.object.attribute.alpha
- p = it.object.point
- alpha = alpha * p.y / 400
- if alpha < self.__min:
- alpha = self.__min
- elif alpha > self.__max:
- alpha = self.__max
- it.object.attribute.alpha = alpha
- it.increment()
-
-## various
-class pyDummyShader(StrokeShader):
- def shade(self, stroke):
- it = stroke.stroke_vertices_begin()
- while not it.is_end:
- toto = Interface0DIterator(it)
- att = it.object.attribute
- att.color = (0.3, 0.4, 0.4)
- att.thickness = (0, 5)
- it.increment()
-
-class pyDebugShader(StrokeShader):
- def shade(self, stroke):
- fe = CF.get_selected_fedge()
- id1 = fe.first_svertex.id
- id2 = fe.second_svertex.id
- #print(id1.first, id1.second)
- #print(id2.first, id2.second)
- it = stroke.stroke_vertices_begin()
- found = True
- foundfirst = True
- foundsecond = False
- while not it.is_end:
- cp = it.object
- if cp.first_svertex.id == id1 or cp.second_svertex.id == id1:
- foundfirst = True
- if cp.first_svertex.id == id2 or cp.second_svertex.id == id2:
- foundsecond = True
- if foundfirst and foundsecond:
- found = True
- break
- it.increment()
- if found:
- print("The selected Stroke id is: ", stroke.id.first, stroke.id.second)
diff --git a/release/scripts/freestyle/style_modules/anisotropic_diffusion.py b/release/scripts/freestyle/styles/anisotropic_diffusion.py
index 274381a9a89..274381a9a89 100644
--- a/release/scripts/freestyle/style_modules/anisotropic_diffusion.py
+++ b/release/scripts/freestyle/styles/anisotropic_diffusion.py
diff --git a/release/scripts/freestyle/style_modules/apriori_and_causal_density.py b/release/scripts/freestyle/styles/apriori_and_causal_density.py
index 501c5169be9..501c5169be9 100644
--- a/release/scripts/freestyle/style_modules/apriori_and_causal_density.py
+++ b/release/scripts/freestyle/styles/apriori_and_causal_density.py
diff --git a/release/scripts/freestyle/style_modules/apriori_density.py b/release/scripts/freestyle/styles/apriori_density.py
index f59dd5cd36d..f59dd5cd36d 100644
--- a/release/scripts/freestyle/style_modules/apriori_density.py
+++ b/release/scripts/freestyle/styles/apriori_density.py
diff --git a/release/scripts/freestyle/style_modules/backbone_stretcher.py b/release/scripts/freestyle/styles/backbone_stretcher.py
index c920d66328e..c920d66328e 100644
--- a/release/scripts/freestyle/style_modules/backbone_stretcher.py
+++ b/release/scripts/freestyle/styles/backbone_stretcher.py
diff --git a/release/scripts/freestyle/style_modules/blueprint_circles.py b/release/scripts/freestyle/styles/blueprint_circles.py
index 8d80893a6c3..8d80893a6c3 100644
--- a/release/scripts/freestyle/style_modules/blueprint_circles.py
+++ b/release/scripts/freestyle/styles/blueprint_circles.py
diff --git a/release/scripts/freestyle/style_modules/blueprint_ellipses.py b/release/scripts/freestyle/styles/blueprint_ellipses.py
index d5e9f385c12..d5e9f385c12 100644
--- a/release/scripts/freestyle/style_modules/blueprint_ellipses.py
+++ b/release/scripts/freestyle/styles/blueprint_ellipses.py
diff --git a/release/scripts/freestyle/style_modules/blueprint_squares.py b/release/scripts/freestyle/styles/blueprint_squares.py
index 35cfc33fd69..35cfc33fd69 100644
--- a/release/scripts/freestyle/style_modules/blueprint_squares.py
+++ b/release/scripts/freestyle/styles/blueprint_squares.py
diff --git a/release/scripts/freestyle/style_modules/cartoon.py b/release/scripts/freestyle/styles/cartoon.py
index de4a5dd6c5d..de4a5dd6c5d 100644
--- a/release/scripts/freestyle/style_modules/cartoon.py
+++ b/release/scripts/freestyle/styles/cartoon.py
diff --git a/release/scripts/freestyle/style_modules/contour.py b/release/scripts/freestyle/styles/contour.py
index f7a42d9c906..f7a42d9c906 100644
--- a/release/scripts/freestyle/style_modules/contour.py
+++ b/release/scripts/freestyle/styles/contour.py
diff --git a/release/scripts/freestyle/style_modules/curvature2d.py b/release/scripts/freestyle/styles/curvature2d.py
index 4204951fbd5..4204951fbd5 100644
--- a/release/scripts/freestyle/style_modules/curvature2d.py
+++ b/release/scripts/freestyle/styles/curvature2d.py
diff --git a/release/scripts/freestyle/style_modules/external_contour.py b/release/scripts/freestyle/styles/external_contour.py
index c7e51f1a16d..c7e51f1a16d 100644
--- a/release/scripts/freestyle/style_modules/external_contour.py
+++ b/release/scripts/freestyle/styles/external_contour.py
diff --git a/release/scripts/freestyle/style_modules/external_contour_sketchy.py b/release/scripts/freestyle/styles/external_contour_sketchy.py
index b8d381ea24e..b8d381ea24e 100644
--- a/release/scripts/freestyle/style_modules/external_contour_sketchy.py
+++ b/release/scripts/freestyle/styles/external_contour_sketchy.py
diff --git a/release/scripts/freestyle/style_modules/external_contour_smooth.py b/release/scripts/freestyle/styles/external_contour_smooth.py
index 045984ea4a5..045984ea4a5 100644
--- a/release/scripts/freestyle/style_modules/external_contour_smooth.py
+++ b/release/scripts/freestyle/styles/external_contour_smooth.py
diff --git a/release/scripts/freestyle/style_modules/haloing.py b/release/scripts/freestyle/styles/haloing.py
index feadb5dd1ee..feadb5dd1ee 100644
--- a/release/scripts/freestyle/style_modules/haloing.py
+++ b/release/scripts/freestyle/styles/haloing.py
diff --git a/release/scripts/freestyle/style_modules/ignore_small_occlusions.py b/release/scripts/freestyle/styles/ignore_small_occlusions.py
index a9dc102a0bd..a9dc102a0bd 100644
--- a/release/scripts/freestyle/style_modules/ignore_small_occlusions.py
+++ b/release/scripts/freestyle/styles/ignore_small_occlusions.py
diff --git a/release/scripts/freestyle/style_modules/invisible_lines.py b/release/scripts/freestyle/styles/invisible_lines.py
index b94d7b0a5fd..b94d7b0a5fd 100644
--- a/release/scripts/freestyle/style_modules/invisible_lines.py
+++ b/release/scripts/freestyle/styles/invisible_lines.py
diff --git a/release/scripts/freestyle/style_modules/japanese_bigbrush.py b/release/scripts/freestyle/styles/japanese_bigbrush.py
index 4a5203fb31f..4a5203fb31f 100644
--- a/release/scripts/freestyle/style_modules/japanese_bigbrush.py
+++ b/release/scripts/freestyle/styles/japanese_bigbrush.py
diff --git a/release/scripts/freestyle/style_modules/long_anisotropically_dense.py b/release/scripts/freestyle/styles/long_anisotropically_dense.py
index f181bcc2348..f181bcc2348 100644
--- a/release/scripts/freestyle/style_modules/long_anisotropically_dense.py
+++ b/release/scripts/freestyle/styles/long_anisotropically_dense.py
diff --git a/release/scripts/freestyle/style_modules/multiple_parameterization.py b/release/scripts/freestyle/styles/multiple_parameterization.py
index 922a5ffa972..922a5ffa972 100644
--- a/release/scripts/freestyle/style_modules/multiple_parameterization.py
+++ b/release/scripts/freestyle/styles/multiple_parameterization.py
diff --git a/release/scripts/freestyle/style_modules/nature.py b/release/scripts/freestyle/styles/nature.py
index fdc627ecade..fdc627ecade 100644
--- a/release/scripts/freestyle/style_modules/nature.py
+++ b/release/scripts/freestyle/styles/nature.py
diff --git a/release/scripts/freestyle/style_modules/near_lines.py b/release/scripts/freestyle/styles/near_lines.py
index 977b99546bc..977b99546bc 100644
--- a/release/scripts/freestyle/style_modules/near_lines.py
+++ b/release/scripts/freestyle/styles/near_lines.py
diff --git a/release/scripts/freestyle/style_modules/occluded_by_specific_object.py b/release/scripts/freestyle/styles/occluded_by_specific_object.py
index 0ae0804f2a8..0ae0804f2a8 100644
--- a/release/scripts/freestyle/style_modules/occluded_by_specific_object.py
+++ b/release/scripts/freestyle/styles/occluded_by_specific_object.py
diff --git a/release/scripts/freestyle/style_modules/polygonalize.py b/release/scripts/freestyle/styles/polygonalize.py
index e570e15ca6c..e570e15ca6c 100644
--- a/release/scripts/freestyle/style_modules/polygonalize.py
+++ b/release/scripts/freestyle/styles/polygonalize.py
diff --git a/release/scripts/freestyle/style_modules/qi0.py b/release/scripts/freestyle/styles/qi0.py
index 8386817754a..8386817754a 100644
--- a/release/scripts/freestyle/style_modules/qi0.py
+++ b/release/scripts/freestyle/styles/qi0.py
diff --git a/release/scripts/freestyle/style_modules/qi0_not_external_contour.py b/release/scripts/freestyle/styles/qi0_not_external_contour.py
index 6315f1291d4..6315f1291d4 100644
--- a/release/scripts/freestyle/style_modules/qi0_not_external_contour.py
+++ b/release/scripts/freestyle/styles/qi0_not_external_contour.py
diff --git a/release/scripts/freestyle/style_modules/qi1.py b/release/scripts/freestyle/styles/qi1.py
index d5424b37748..d5424b37748 100644
--- a/release/scripts/freestyle/style_modules/qi1.py
+++ b/release/scripts/freestyle/styles/qi1.py
diff --git a/release/scripts/freestyle/style_modules/qi2.py b/release/scripts/freestyle/styles/qi2.py
index 367139de2e2..367139de2e2 100644
--- a/release/scripts/freestyle/style_modules/qi2.py
+++ b/release/scripts/freestyle/styles/qi2.py
diff --git a/release/scripts/freestyle/style_modules/sequentialsplit_sketchy.py b/release/scripts/freestyle/styles/sequentialsplit_sketchy.py
index 755098d96b6..755098d96b6 100644
--- a/release/scripts/freestyle/style_modules/sequentialsplit_sketchy.py
+++ b/release/scripts/freestyle/styles/sequentialsplit_sketchy.py
diff --git a/release/scripts/freestyle/style_modules/sketchy_multiple_parameterization.py b/release/scripts/freestyle/styles/sketchy_multiple_parameterization.py
index c1cc4361f7f..c1cc4361f7f 100644
--- a/release/scripts/freestyle/style_modules/sketchy_multiple_parameterization.py
+++ b/release/scripts/freestyle/styles/sketchy_multiple_parameterization.py
diff --git a/release/scripts/freestyle/style_modules/sketchy_topology_broken.py b/release/scripts/freestyle/styles/sketchy_topology_broken.py
index 27669d20a4e..27669d20a4e 100644
--- a/release/scripts/freestyle/style_modules/sketchy_topology_broken.py
+++ b/release/scripts/freestyle/styles/sketchy_topology_broken.py
diff --git a/release/scripts/freestyle/style_modules/sketchy_topology_preserved.py b/release/scripts/freestyle/styles/sketchy_topology_preserved.py
index 25e593b7c99..25e593b7c99 100644
--- a/release/scripts/freestyle/style_modules/sketchy_topology_preserved.py
+++ b/release/scripts/freestyle/styles/sketchy_topology_preserved.py
diff --git a/release/scripts/freestyle/style_modules/split_at_highest_2d_curvatures.py b/release/scripts/freestyle/styles/split_at_highest_2d_curvatures.py
index 474183c3810..474183c3810 100644
--- a/release/scripts/freestyle/style_modules/split_at_highest_2d_curvatures.py
+++ b/release/scripts/freestyle/styles/split_at_highest_2d_curvatures.py
diff --git a/release/scripts/freestyle/style_modules/split_at_tvertices.py b/release/scripts/freestyle/styles/split_at_tvertices.py
index 70446e0178e..70446e0178e 100644
--- a/release/scripts/freestyle/style_modules/split_at_tvertices.py
+++ b/release/scripts/freestyle/styles/split_at_tvertices.py
diff --git a/release/scripts/freestyle/style_modules/stroke_texture.py b/release/scripts/freestyle/styles/stroke_texture.py
index c925633b579..c925633b579 100644
--- a/release/scripts/freestyle/style_modules/stroke_texture.py
+++ b/release/scripts/freestyle/styles/stroke_texture.py
diff --git a/release/scripts/freestyle/style_modules/suggestive.py b/release/scripts/freestyle/styles/suggestive.py
index bb5e20f2a2e..bb5e20f2a2e 100644
--- a/release/scripts/freestyle/style_modules/suggestive.py
+++ b/release/scripts/freestyle/styles/suggestive.py
diff --git a/release/scripts/freestyle/style_modules/thickness_fof_depth_discontinuity.py b/release/scripts/freestyle/styles/thickness_fof_depth_discontinuity.py
index 20c8240b3cd..20c8240b3cd 100644
--- a/release/scripts/freestyle/style_modules/thickness_fof_depth_discontinuity.py
+++ b/release/scripts/freestyle/styles/thickness_fof_depth_discontinuity.py
diff --git a/release/scripts/freestyle/style_modules/tipremover.py b/release/scripts/freestyle/styles/tipremover.py
index efcddb7321f..efcddb7321f 100644
--- a/release/scripts/freestyle/style_modules/tipremover.py
+++ b/release/scripts/freestyle/styles/tipremover.py
diff --git a/release/scripts/freestyle/style_modules/tvertex_remover.py b/release/scripts/freestyle/styles/tvertex_remover.py
index 565962c1b0e..565962c1b0e 100644
--- a/release/scripts/freestyle/style_modules/tvertex_remover.py
+++ b/release/scripts/freestyle/styles/tvertex_remover.py
diff --git a/release/scripts/freestyle/style_modules/uniformpruning_zsort.py b/release/scripts/freestyle/styles/uniformpruning_zsort.py
index efa977ccf5f..efa977ccf5f 100644
--- a/release/scripts/freestyle/style_modules/uniformpruning_zsort.py
+++ b/release/scripts/freestyle/styles/uniformpruning_zsort.py
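
The `pyBluePrintDirectedSquaresShader` hunk removed above orients its squares along the stroke's principal axes, computed in closed form from the 2D covariance of the stroke vertices (trace/determinant eigenvalues, then a half-angle for the rotation). A minimal standalone sketch of that math follows; this is plain Python, not Blender/Freestyle API code, `principal_axes` is a hypothetical helper name, and it substitutes `math.atan2` for the shader's `math.atan` ratio to sidestep the division by zero when `p_var_xx == p_var_yy`:

```python
import math

def principal_axes(points):
    """Return (lambda1, lambda2, theta): the eigenvalues of the 2D
    covariance matrix of `points` and the orientation of the major axis,
    mirroring the closed-form math in pyBluePrintDirectedSquaresShader.
    Note: atan2 replaces the original atan(2*xy / (xx - yy)) ratio, an
    assumption made here to avoid a zero denominator."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # Second central moments (the 2x2 covariance entries).
    var_xx = sum((p[0] - mx) ** 2 for p in points) / n
    var_yy = sum((p[1] - my) ** 2 for p in points) / n
    var_xy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Eigenvalues of a symmetric 2x2 matrix from its trace and determinant.
    trace = var_xx + var_yy
    det = var_xx * var_yy - var_xy * var_xy
    sqrt_coeff = math.sqrt(trace * trace - 4 * det)
    lambda1 = (trace + sqrt_coeff) / 2
    lambda2 = (trace - sqrt_coeff) / 2
    # Half-angle formula for the rotation of the major eigenvector.
    theta = math.atan2(2 * var_xy, var_xx - var_yy) / 2
    return lambda1, lambda2, theta
```

For points spread along the x-axis this yields `theta == 0` with all variance in `lambda1`; the shader then scales its square's edge vectors by `sqrt(lambda1)` and `sqrt(lambda2)` to fit the stroke's extent.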