
git.blender.org/blender.git
author    Sergey Sharybin <sergey.vfx@gmail.com>  2016-04-04 15:10:09 +0300
committer Sergey Sharybin <sergey.vfx@gmail.com>  2016-04-04 15:13:19 +0300
commit    be2186ad629a6dc7057627e56b404db212de2deb (patch)
tree      644956cf42c609a97a7a2843c28be5de9d9fac27 /intern/cycles/util
parent    5ab3a97dbbcfe2cb2bc7093f5e18a195eb31f080 (diff)
Cycles: Solve possible issues with running out of stack memory allocator
The policy here is a bit more complicated: if the tree becomes too deep, we are forced to create a leaf node, and the size of that leaf is not well predicted, which makes it tricky to rely on a single fixed-size stack array. This change makes it an official feature that StackAllocator falls back to the heap when it runs out of stack memory. That is still much better than always using a heap allocator.
Diffstat (limited to 'intern/cycles/util')
-rw-r--r--  intern/cycles/util/util_stack_allocator.h | 25
1 file changed, 16 insertions(+), 9 deletions(-)
diff --git a/intern/cycles/util/util_stack_allocator.h b/intern/cycles/util/util_stack_allocator.h
index 85edd8061ee..29260888eef 100644
--- a/intern/cycles/util/util_stack_allocator.h
+++ b/intern/cycles/util/util_stack_allocator.h
@@ -54,28 +54,35 @@ public:
T *allocate(size_t n, const void *hint = 0)
{
(void)hint;
+ if(n == 0) {
+ return NULL;
+ }
if(pointer_ + n >= SIZE) {
- assert(!"Stack allocator overallocated");
- /* NOTE: This is just a safety feature for the release builds, so
- * we fallback to a less efficient allocation but preventing crash
- * from happening.
- */
- return (T*)malloc(n * sizeof(T));
+ size_t size = n * sizeof(T);
+ util_guarded_mem_alloc(size);
+#ifdef WITH_BLENDER_GUARDEDALLOC
+ return (T*)MEM_mallocN_aligned(size, 16, "Cycles Alloc");
+#else
+ return (T*)malloc(size);
+#endif
}
T *mem = &data_[pointer_];
pointer_ += n;
return mem;
}
- void deallocate(T *p, size_t /*n*/)
+ void deallocate(T *p, size_t n)
{
if(p == NULL) {
return;
}
if(p < data_ || p >= data_ + SIZE) {
- /* Again this is just a safety feature for the release builds. */
- assert(!"Should never happen");
+ util_guarded_mem_free(n * sizeof(T));
+#ifdef WITH_BLENDER_GUARDEDALLOC
+ MEM_freeN(p);
+#else
free(p);
+#endif
return;
}
/* We don't support memory free for the stack allocator. */