I am creating a mesh at runtime from a native plugin by creating a new mesh and calling GetNativeIndexBufferPtr and GetNativeVertexBufferPtr, then filling them in from the native side.
Now I want to import skinning data as well. But when I set renderer.bones from the Unity side, it seems to have no effect, and at one point I got an error about the mesh needing to be marked read/write at import. (I am not importing it, but I am marking it dynamic so that I can fill the vertex and index buffers at runtime from my plugin.)
I am guessing that just like setting vertices or indices on a mesh from the Unity side has no effect once you get the native pointers, you can't set the bones. But I would like to confirm this. Is there any way to set the bones on a SkinnedMeshRenderer using a mesh created from a native plugin?
↧
Setting SkinnedMeshRenderer bones from a native plugin
↧
Native Plugin Texture2d rendering fails in build
Hey peeps, I am playing with native-plugin Texture2D manipulation: I attach a Texture2D to a quad's material and then pass the Texture2D's native pointer to a DLL, similar to this doc: https://docs.unity3d.com/Manual/NativePluginInterface.html. I also pass the image from my webcam to the DLL and update the Texture2D on the GPU. This works in the editor, but I have no idea why the image data seems to break in the build.
Working in editor:
![alt text][1]
Failing in build:
![alt text][2]
[1]: http://i1376.photobucket.com/albums/ah18/email2jie/WorkingInEditor_zpsija256uu.jpg
[2]: http://i1376.photobucket.com/albums/ah18/email2jie/failure_zpsahueojfn.jpg
↧
↧
WinSparkle dll not being recognized by Unity - other native dll is
Hello.
**Goal:** Get WinSparkle.dll to run in the Editor via the Plugins folder.
I am trying to use the WinSparkle native DLL with my Unity project. Play Mode/the Editor is unable to find the .dll. To test whether I set up WinSparkle.dll correctly or not (correct directory, correct import attributes, etc.), I made a test.dll in C++, and I can access that one as expected.
I'm sure I'm missing something really basic. All help is appreciated.
**What I've Done So Far:**
- Confirmed the DLLs are in the Plugins folder
- Confirmed the DLL's import settings allow it to run in the Editor and in Standalone Windows
- Confirmed I am running .NET 4.x in the Unity settings and the API level is set to 4.0
- Tried both x64 and x86 versions of the DLL in the Plugins folder
- Created Unmanaged_Test.dll and confirmed it works in any namespace and any class
- Built the project for Windows Standalone. Though the plugin is included in the build, my scripts cannot find it if the .dll is in the Plugins folder. On the contrary, if I manually dump it at the root level, right beside the .exe, everything works
- Placed the WinSparkle dll in the ProgramFiles/Unity/Hub/Editor directory _(this gets it working in Play Mode)_
- Started a second project and repeated my steps
**Additional Info:**
- Unity version: 2018.1.6f1
- Visual Studio 2017
- WinSparkle was downloaded directly from source. When that failed, I built the [GitHub project][1]

[1]: https://github.com/vslavik/winsparkle
↧
Alarm app using unity c#
Hi, does anyone know of any plugin or method to make an alarm/reminder app using Unity? I found a plugin named "Android Goodies"; it provides alarm functionality via the default Android clock app, but I don't want that. I want to make an alarm app that doesn't leave the Unity app: the alarm should be set while staying within the Unity app. I'm attaching a screenshot of what I want to develop. Please help if anyone knows anything about my question. Thanks! ![alt text][1]
[1]: /storage/temp/121988-alarm-question.png
↧
Native Libraries in windows unity personal 2018
I am trying to run my x86 C++ DLLs in my Unity Personal 64-bit edition, but I keep getting a "DLL not found" error when I run it. Are x86 DLLs even supported? If they are, where do I need to place them?
↧
↧
Audio Spatializer Plugin SDK: how can plugin get channel count of input audio source?
Hello Unity friends!
I am working on a spatializer plugin and its DSP code, and I understand that Unity passes audio to my plugin as stereo, regardless of whether the underlying AudioClip is mono, stereo, or multi-channel, by up- or down-mixing as necessary.
So, given that my plugin will always "see" audio as stereo, how can I access per-instance (i.e. per audio source) info such as the channel count of the source? The documentation on spatializer instance data includes the following:
struct UnityAudioSpatializerData
{
float listenermatrix[16]; // Matrix that transforms sourcepos into the local space of the listener
float sourcematrix[16]; // Transform matrix of audio source
float spatialblend; // Distance-controlled spatial blend
float reverbzonemix; // Reverb zone mix level parameter (and curve) on audio source
float spread; // Spread parameter of the audio source (0..360 degrees)
float stereopan; // Stereo panning parameter of the audio source (-1: fully left, 1: fully right)
// The spatializer plugin may override the distance attenuation in order to
// influence the voice prioritization (leave this callback as NULL to use the
// built-in audio source attenuation curve)
UnityAudioEffect_DistanceAttenuationCallback distanceattenuationcallback;
};
I would have thought that a channel count in this struct would be useful. So is there a way to infer the mono/stereo/multi-channel nature of the audio source from the above? (E.g. maybe stereopan is set to a specific value for mono sources?) It seems I must be missing something here, because it's clear that a spatializer would want to spatialize an audio source differently depending on its number of channels.
Thanks!
↧
Does Unity support mixed assemblies (with both native and .Net code) in a single DLL?
Hi,
I have a DLL written in C++. It also contains a .NET API written in C++/CLI, and these are linked together in a single DLL. The C++ code uses some Win32-specific calls, so I'm happy to be restricted to Windows platforms for now.
From any regular C# project in Visual Studio, I can just add it as a reference and use the .NET API to seamlessly interact with the library - no need for P/Invoke, `DllImport` or writing a C API.
However, when I import it as an asset into a Unity project, it's recognized as a native plugin, and I can't use the C# API from any scripts in the project - they all behave as if the DLL doesn't exist. The C++/CLI code is compiled against .Net Framework 4.0, so I've set the Scripting Runtime Version to '.NET 4.x Equivalent', and the Api Compatibility Level to '.NET 4.x'.
I tried working around this by writing a C# library in Visual Studio that references the mixed C++/.Net library, and then importing both this C# library and the mixed library into the Unity project. When I do that the scripts in the Unity project compile, but the entire editor crashes when I press play.
Are mixed assemblies just not supported in Unity? If so, I couldn't find this documented anywhere. If they are, how can I figure out why Unity is treating my mixed assembly DLL as a native plugin rather than a managed plugin?
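If it helps frame the question: the fallback I'm trying to avoid is wrapping everything behind a flat C API over an opaque handle and consuming it via `DllImport` instead of referencing the managed half. A minimal sketch of that pattern (all names are mine, not the real library's):

```cpp
#include <cassert>

// Flat C API over an opaque handle: the C# side only ever sees an IntPtr
// plus free functions, so Unity can treat the DLL as a plain native plugin.
// On Windows these declarations would also carry __declspec(dllexport).
struct Engine;  // opaque to callers

extern "C" Engine* Engine_Create();
extern "C" int     Engine_Query(Engine* e);
extern "C" void    Engine_Destroy(Engine* e);

// In-file implementation so this sketch is self-contained.
struct Engine { int value; };

extern "C" Engine* Engine_Create()           { return new Engine{42}; }
extern "C" int     Engine_Query(Engine* e)   { return e->value; }
extern "C" void    Engine_Destroy(Engine* e) { delete e; }
```

On the C# side this would be consumed as `[DllImport("mylib")] static extern IntPtr Engine_Create();` and so on, holding the handle in an IntPtr. I'd rather not rewrite my whole API this way, hence the question.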
↧
Native audio plugin xcode build problem "undefined symbols"
Hello,
I am trying out the Native Audio SDK. I basically followed the docs manual: https://docs.unity3d.com/Manual/AudioMixerNativeAudioPlugin.html. I downloaded the SDK and created a small plugin following this tutorial (https://medium.com/@othnielcundangan/how-to-use-unitys-native-audio-plugin-sdk-on-macos-f7e1bdbc8141), but when I try to build the bundle with the plugin I get an error in Xcode saying: undefined symbols for architecture x86_64
I have included AudioPluginInterface.h as well as AudioPluginUtil.h/.cpp, and PluginList.h includes only my plugin.
Do I have to include any more libraries or anything? Thank you
![alt text][1]
[1]: /storage/temp/125052-bez-nazvu.png
↧
IL2CPP vs C++ Performance
Hello,
I'm sorry if this question is already discussed, couldn't find anything.
Ok, here is the thing... I build my games using IL2CPP and am pretty happy with it, in spite of the troubles that pop up from time to time.
In one of these games, a Match3, I have logic that looks for combinations and other stuff that is basically algorithms/calculations and so on. You might think that this is no big deal and not **very** performance-dependent code. But when you connect an AI that tests/plays this logic and calls it thousands of times while calculating the best move, the logic becomes extremely performance-dependent.
I'm not going to discuss all my algorithms here, because that is outside the scope of my question. Let's just assume that all these algorithms have been optimized as much as possible in C#.
The question is: if I move some of my logic, like iterations through arrays and so on, to a native C++ plugin, will it boost my performance compared to IL2CPP?
I know there's no 100% guarantee until you see source code, but hypothetically... let's just assume we are talking about a huge iteration through an array.
I've seen some articles where people benchmark native plugins in the Unity Editor, which is ridiculous imho. Of course array initialization and iteration will be faster in the native part compared to the editor's Mono with all its managed overhead.
Also (as far as I know) IL2CPP generates native code for us, which is better for performance, but it actually emulates a managed environment with its own garbage collector and so on.
So here's the question: will my native C++ plugin with marshalling (btw, what happens to marshalling when I build using IL2CPP?) be faster compared to the C++ generated by IL2CPP?
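To make it concrete, this is the kind of native entry point I have in mind; a sketch with an illustrative name, using a blittable int array so that (as far as I understand) the marshaller can pin the managed array and pass a raw pointer instead of copying it:

```cpp
#include <cassert>
#include <cstdint>

// Hot loop moved to native code: counts cells of a given value in a board.
// The interface is deliberately blittable (raw pointer + length), which
// lets Mono/IL2CPP pin the managed int[] rather than marshal a copy.
extern "C" int64_t CountMatches(const int32_t* cells, int32_t count, int32_t match)
{
    int64_t total = 0;
    for (int32_t i = 0; i < count; ++i)
        if (cells[i] == match)
            ++total;
    return total;
}
```

On the C# side I would declare it as `[DllImport("match3native")] static extern long CountMatches(int[] cells, int count, int match);` (plugin name illustrative).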
Sorry for the long question. Thanks!
↧
↧
EntryPointNotFound exception when integrating Picovoice Porcupine
I have been trying to integrate Picovoice Porcupine (https://github.com/Picovoice/Porcupine) into a proof of concept demo built for ARMv7 Android in Unity. I am attempting to access the ARMv7 .so library supplied in the GitHub repo using the DllImport function. The library appears to load correctly (as I do not receive a DllNotFoundException), however I am experiencing EntryPointNotFoundExceptions when trying to call the native functions. I think perhaps I am declaring the function calls incorrectly - if you can help me see where I'm going wrong I'd greatly appreciate it!
I have included the relevant code below, along with the Java native calls from the Android demo I was using as a reference.
**Java reference:**
private native long init(String modelFilePath, String[] keywordFilePaths, float[] sensitivities);
private native int process(long object, short[] pcm);
private native void delete(long object);
**Unity code:**
public class PorcupineManager : ScriptableObject {
    ...
    [DllImport("pv_porcupine")]
    private static extern long init(string modelFilePath, string[] keywordFilePaths, float[] sensitivities);
    [DllImport("pv_porcupine")]
    private static extern int process(long porcupineObjectId, short[] pcm);
    [DllImport("pv_porcupine")]
    private static extern void delete(long porcupineObjectId);
    public void Init() {
        porcupineObject = init(Path.Combine(Application.dataPath, modelPath), new string[] { Path.Combine(Application.dataPath, keywordPath) }, new float[] { sensitivity });
    }
    ...
}
**Excerpt from log:**
> 2018-10-16 17:51:06.654 19176-19208/com.meowtek.commandandcontrol E/Unity: EntryPointNotFoundException: init
> at (wrapper managed-to-native) PorcupineManager:init (string,string[],single[])
> at PorcupineManager.Init () [0x0006e] in /Users/Ronan/Unity Projects/CommandAndControl/Assets/Scripts/Porcupine/PorcupineManager.cs:40
> at PorcupineTest+c__Iterator0.MoveNext () [0x0005d] in /Users/Ronan/Unity Projects/CommandAndControl/Assets/Scripts/Porcupine/PorcupineTest.cs:17
> at UnityEngine.SetupCoroutine.InvokeMoveNext (IEnumerator enumerator, IntPtr returnValueAddress) [0x00028] in /Users/builduser/buildslave/unity/build/Runtime/Export/Coroutines.cs:17
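My current suspicion is a symbol-name mismatch: as far as I know, `[DllImport]` resolves the entry point by the exact exported symbol name, the Java `init` above is a JNI binding (exported under a long `Java_..._init` name), and a C++ function built without `extern "C"` gets a compiler-mangled name, so neither matches a plain `init`. A sketch of the kind of export a plain `DllImport` declaration would actually find (function name is mine, not Porcupine's):

```cpp
#include <cassert>

// With extern "C", the symbol is exported literally as "pv_demo_add", so
// [DllImport("pv_porcupine")] static extern float pv_demo_add(float, float)
// would resolve. Without extern "C", a C++ compiler exports a mangled name
// (e.g. _Z11pv_demo_addff), and the runtime throws
// EntryPointNotFoundException for the plain name.
extern "C" float pv_demo_add(float x, float y)
{
    return x + y;
}
```

If that's the issue, inspecting the library's real exported names (and matching my declarations to them) should show where I'm going wrong.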
↧
Native Plugin works in editor but not in build
I'm building a virtual synthesizer in Unity, creating sounds using a C++ DLL. The DLL runs on a custom thread declared in C# inside Unity. My problem is that the DLL runs perfectly inside the Editor, with no crashes or anything. However, when I try to build and run the standalone Windows application, it just terminates when reaching

> waveOutWrite(m_hwDevice, &m_pWaveHeaders[m_nBlockCurrent], sizeof(WAVEHDR));

at the end of the code.
----------
This is the full header file inside the DLL where the crash occurs:
#pragma once
#pragma comment(lib, "winmm.lib")
#include <windows.h>
#include <string>
#include <vector>
#include <algorithm>
#include <cmath>
#include <thread>
#include <mutex>
#include <condition_variable>
#include <atomic>
#define T short
using namespace std;
class SoundEngine
{
public:
SoundEngine() {}
SoundEngine(wstring sOutputDevice, unsigned int nSampleRate = 44100, unsigned int nChannels = 1, unsigned int nBlocks = 8, unsigned int nBlockSamples = 512)
{
Create(sOutputDevice, nSampleRate, nChannels, nBlocks, nBlockSamples);
}
~SoundEngine()
{
Destroy();
}
bool Create(wstring sOutputDevice, unsigned int nSampleRate = 44100, unsigned int nChannels = 1, unsigned int nBlocks = 8, unsigned int nBlockSamples = 512)
{
m_bReady = false;
m_nSampleRate = nSampleRate;
m_nChannels = nChannels;
m_nBlockCount = nBlocks;
m_nBlockSamples = nBlockSamples;
m_nBlockFree = m_nBlockCount;
m_nBlockCurrent = 0;
m_pBlockMemory = nullptr;
m_pWaveHeaders = nullptr;
// Validate device
vector<wstring> devices = Enumerate();
auto d = std::find(devices.begin(), devices.end(), sOutputDevice);
if (d != devices.end())
{
// Device is available
int nDeviceID = distance(devices.begin(), d);
WAVEFORMATEX waveFormat;
waveFormat.wFormatTag = WAVE_FORMAT_PCM;
waveFormat.nSamplesPerSec = m_nSampleRate;
waveFormat.wBitsPerSample = sizeof(T) * 8;
waveFormat.nChannels = m_nChannels;
waveFormat.nBlockAlign = (waveFormat.wBitsPerSample / 8) * waveFormat.nChannels;
waveFormat.nAvgBytesPerSec = waveFormat.nSamplesPerSec * waveFormat.nBlockAlign;
waveFormat.cbSize = 0;
// Open Device if valid
if (waveOutOpen(&m_hwDevice, nDeviceID, &waveFormat, (DWORD_PTR)waveOutProcWrap, (DWORD_PTR)this, CALLBACK_FUNCTION) != S_OK)
return Destroy();
}
// Allocate Wave|Block Memory
m_pBlockMemory = new T[m_nBlockCount * m_nBlockSamples];
if (m_pBlockMemory == nullptr)
return Destroy();
ZeroMemory(m_pBlockMemory, sizeof(T) * m_nBlockCount * m_nBlockSamples);
m_pWaveHeaders = new WAVEHDR[m_nBlockCount];
if (m_pWaveHeaders == nullptr)
return Destroy();
ZeroMemory(m_pWaveHeaders, sizeof(WAVEHDR) * m_nBlockCount);
// Link headers to block memory
for (unsigned int n = 0; n < m_nBlockCount; n++)
{
m_pWaveHeaders[n].dwBufferLength = m_nBlockSamples * sizeof(T);
m_pWaveHeaders[n].lpData = (LPSTR)(m_pBlockMemory + (n * m_nBlockSamples));
}
m_bReady = true;
m_thread = thread(&SoundEngine::MainThread, this);
// Start the ball rolling
unique_lock<mutex> lm(m_muxBlockNotZero);
m_cvBlockNotZero.notify_one();
return true;
}
bool Destroy()
{
return false;
}
void Stop()
{
m_bReady = false;
m_thread.join();
}
// Override to process current sample
virtual double UserProcess(int nChannel, double dTime)
{
return 0.0;
}
double GetTime()
{
return m_dGlobalTime;
}
public:
static vector<wstring> Enumerate()
{
int nDeviceCount = waveOutGetNumDevs();
vector<wstring> sDevices;
WAVEOUTCAPS woc;
for (int n = 0; n < nDeviceCount; n++)
if (waveOutGetDevCaps(n, &woc, sizeof(WAVEOUTCAPS)) == S_OK)
sDevices.push_back(woc.szPname);
return sDevices;
}
double clip(double dSample, double dMax)
{
if (dSample >= 0.0)
return fmin(dSample, dMax);
else
return fmax(dSample, -dMax);
}
private:
unsigned int m_nSampleRate;
unsigned int m_nChannels;
unsigned int m_nBlockCount;
unsigned int m_nBlockSamples;
unsigned int m_nBlockCurrent;
T* m_pBlockMemory;
WAVEHDR *m_pWaveHeaders;
HWAVEOUT m_hwDevice;
thread m_thread;
atomic<bool> m_bReady;
atomic<unsigned int> m_nBlockFree;
condition_variable m_cvBlockNotZero;
mutex m_muxBlockNotZero;
atomic<double> m_dGlobalTime;
// Handler for soundcard request for more data
void waveOutProc(HWAVEOUT hWaveOut, UINT uMsg, DWORD dwParam1, DWORD dwParam2)
{
if (uMsg != WOM_DONE) return;
m_nBlockFree++;
unique_lock<mutex> lm(m_muxBlockNotZero);
m_cvBlockNotZero.notify_one();
}
// Static wrapper for sound card handler
static void CALLBACK waveOutProcWrap(HWAVEOUT hWaveOut, UINT uMsg, DWORD dwInstance, DWORD dwParam1, DWORD dwParam2)
{
((SoundEngine*)dwInstance)->waveOutProc(hWaveOut, uMsg, dwParam1, dwParam2);
}
// Main thread. This loop responds to requests from the soundcard to fill 'blocks'
// with audio data. If no requests are available it goes dormant until the sound
// card is ready for more data. The block is filled by the "user" in some manner
// and then issued to the soundcard.
void MainThread()
{
m_dGlobalTime = 0.0;
double dTimeStep = 1.0 / (double)m_nSampleRate;
// Goofy hack to get maximum integer for a type at run-time
T nMaxSample = (T)pow(2, (sizeof(T) * 8) - 1) - 1;
double dMaxSample = (double)nMaxSample;
T nPreviousSample = 0;
while (m_bReady)
{
// Wait for block to become available
if (m_nBlockFree == 0)
{
unique_lock<mutex> lm(m_muxBlockNotZero);
while (m_nBlockFree == 0) // sometimes, Windows signals incorrectly
m_cvBlockNotZero.wait(lm);
}
// Block is here, so use it
m_nBlockFree--;
// Prepare block for processing
if (m_pWaveHeaders[m_nBlockCurrent].dwFlags & WHDR_PREPARED)
waveOutUnprepareHeader(m_hwDevice, &m_pWaveHeaders[m_nBlockCurrent], sizeof(WAVEHDR));
T nNewSample = 0;
int nCurrentBlock = m_nBlockCurrent * m_nBlockSamples;
for (unsigned int n = 0; n < m_nBlockSamples; n += m_nChannels)
{
// User Process
for (unsigned int c = 0; c < m_nChannels; c++)
{
nNewSample = (T)(clip(MakeNoise(c, m_dGlobalTime), 1.0) * dMaxSample);
m_pBlockMemory[nCurrentBlock + n + c] = nNewSample;
nPreviousSample = nNewSample;
}
m_dGlobalTime = m_dGlobalTime + dTimeStep;
}
// Send block to sound device
waveOutPrepareHeader(m_hwDevice, &m_pWaveHeaders[m_nBlockCurrent], sizeof(WAVEHDR));
waveOutWrite(m_hwDevice, &m_pWaveHeaders[m_nBlockCurrent], sizeof(WAVEHDR)); // THIS MAKES EVERYTHING CRASH!!!
m_nBlockCurrent++;
m_nBlockCurrent %= m_nBlockCount;
}
}
virtual double MakeNoise(int a, double b) = 0;
};
Please tell me if I need to clarify anything.
**Thanks in advance!**
↧
Is native plugin support for Vulkan working?
In the example native rendering plugin, if SUPPORT_VULKAN is enabled and the Vulkan API is activated in the Editor, m_UnityVulkan = interfaces->Get<IUnityGraphicsVulkan>() nevertheless returns a null pointer.
Why is this? Has Vulkan support been implemented for native plugins?
↧
WebView support in WebGL?
Can we have a WebView in WebGL, or some other HTML rendering that serves the purpose of a WebView in WebGL?
I need a native in-app purchase/payment view in WebGL.
Any help is really appreciated.
↧
↧
Unity-Android problems with plugins in unity 2018.2.19f1
We are getting the following error when trying to use OpenCV libraries and some native libraries we created for Android in Unity. The thing is, I updated Unity after a long time, and the previous version didn't give me any problem:
Exception: Unknown CPU architecture for library
Assets/Plugins/Android/libopencv_bioinspired.a
UnityEditor.Android.PostProcessor.Tasks.NativePlugins.ProcessPlugin
(UnityEditor.Android.PostProcessor.PostProcessorContext context,
System.String pluginPath, System.String pluginTargetCPU)
UnityEditor.Android.PostProcessor.Tasks.NativePlugins.Execute
(UnityEditor.Android.PostProcessor.PostProcessorContext context)
UnityEditor.Android.PostProcessor.PostProcessRunner.RunAllTasks
(UnityEditor.Android.PostProcessor.PostProcessorContext context)
UnityEditor.Android.PostProcessAndroidPlayer.PostProcess (BuildTarget
target, System.String stagingAreaData, System.String stagingArea,
System.String playerPackage, System.String installPath, System.String
companyName, System.String productName, BuildOptions options,
UnityEditor.RuntimeClassRegistry usedClassRegistry,
UnityEditor.Build.Reporting.BuildReport report)
UnityEditor.Android.AndroidBuildPostprocessor.PostProcess
(BuildPostProcessArgs args, UnityEditor.BuildProperties& outProperties)
UnityEditor.PostprocessBuildPlayer.Postprocess (BuildTargetGroup
targetGroup, BuildTarget target, System.String installPath, System.String
companyName, System.String productName, Int32 width, Int32 height,
BuildOptions options, UnityEditor.RuntimeClassRegistry usedClassRegistry,
UnityEditor.Build.Reporting.BuildReport report) (at
C:/buildslave/unity/build/Editor/Mono/BuildPipeline/PostprocessBuildPlayer.cs:287)
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)
↧
What is the way to go to implement leaderboards/achievements as a solo developer?
It really feels like it should be easy to implement leaderboards/achievements on both Android and iOS, but from what I see it really is not. From what I've found, for an indie developer it's best to rely on "Google Play Games" for Android and "Game Center" for iOS; however, I can't afford to spend months implementing both and then dealing with the troubles. Should I consider buying an asset, e.g. the [Native plugins asset][1]?
[1]: https://assetstore.unity.com/packages/tools/integration/cross-platform-native-plugins-ultra-pack-31086
↧
Audio Spatializer SDK. Source/Listener location matrix
Hi all,
So I'm a little confused about the matrices that hold the positional coordinates of the sources/listener. What exactly are elements 0-11 in the array, and how exactly can I get simple information about the rotation of my objects from Unity?
I just need a simple X-Y-Z rotation number, the same one that appears in Unity's Transform section.
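For what it's worth, my current reading (please correct me if this is wrong) is that the matrices are 4x4 in column-major order, so elements 0-11 are the rotation/scale part, elements 12-14 are the translation, and transforming the source position by listenermatrix gives it in the listener's local space, from which an angle falls out. A sketch of that assumption (helper names are mine):

```cpp
#include <cassert>
#include <cmath>

// Assuming column-major 4x4 matrices as in UnityAudioSpatializerData:
// the source's world position sits in the translation column (12..14)
// of sourcematrix, and listenermatrix maps world space into the
// listener's local space.
static void SourcePosInListenerSpace(const float L[16], const float S[16], float out[3])
{
    const float px = S[12], py = S[13], pz = S[14]; // source world position
    out[0] = L[0]*px + L[4]*py + L[8]*pz  + L[12];
    out[1] = L[1]*px + L[5]*py + L[9]*pz  + L[13];
    out[2] = L[2]*px + L[6]*py + L[10]*pz + L[14];
}

// Horizontal angle of the source around the listener, in degrees:
// 0 = straight ahead (+z), positive toward the listener's right (+x).
static float AzimuthDegrees(const float local[3])
{
    return std::atan2(local[0], local[2]) * 57.2957795f;
}
```

That gives me the source's bearing relative to the listener, but I still can't see how to recover the plain X-Y-Z rotation of the object itself.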
Thank you!
↧
How to debug by step Native C++ Library with Unity Free?
Hi everyone,
I have created a C++ dynamic library and linked it through a wrapper to my Unity project.
The connection is working fine, but debugging is a nightmare... :(
I would like to be able to debug the library step by step through Visual Studio with Unity Free.
In 2016 I was working with Unity Pro and it was really easy to do. I hope that 3 years later it is now also possible with the Free version?
Thanks for your help,
Regards,
Clément
↧
↧
Native Plugins C++: Android sample and self-built PCL not loading ("dll not found")
Hello unity-community,
I've got some trouble with native plugins.
I compiled the Point Cloud Library (PCL) into a static and a dynamic lib.
Neither of them can be found when calling them the way your documentation describes (https://docs.unity3d.com/Manual/NativePlugins.html).
I tried importing with:
[DllImport("libnative.so")]
private static extern float add(float x, float y);
[DllImport("native")]
private static extern float add(float x, float y);
With both I get the same result:
System DllNotFoundException: libnative.so / native
at (wrapper managed-to-native) CallNativeCode: add(single,single)
at CallNativeCode._callAdd (Single x, Single y) [0x00000] in :0
at Thesis.Scripts.AppController.Update () [0x00000] in :0
Even your sample AndroidNativePlugin file doesn't work for me.
(https://docs.unity3d.com/Manual/AndroidNativePlugins.html)
Maybe I'm doing something wrong.
I tried with the following Unity versions:
- 2018.3.0f2
- 2018.2.15f
- 2017.4.17f1
Thanks in advance for helping me
Yours sincerely
Stefan
↧
Touchscreen input on linux (ubuntu)
Hi guys,
I'm facing quite a challenge: I'm trying to get a touchscreen to work with Unity on Linux.
My problem is that this Unity feature is actually broken (https://forum.unity.com/threads/linux-build-64-bit-and-universal-doesnt-respond-to-touch-screen-events-linux-editor-works.525748/#post-4136314).
I can't just switch to something else, so I have to make it work...
I've found this comment (http://answers.unity.com/comments/1377944/view.html) but I don't understand how he did it. It seems that he used Xlib in a native plugin, but the problem is that Xlib requires a window to actually get inputs, and I can't find a way to get the Unity window.
Any chance some of you have an idea?
↧
Unable to launch a Unity ARCore application from a native Android application
## I am trying to open an ARCore application from a native Android application.
**For that I did the following steps:**
1. Exported the Unity ARCore project as a Gradle project
2. Converted the Unity Gradle project to a library and generated the .aar file
3. Integrated the generated .aar file into the native Android project
4. Then tried to call the Unity ARCore project's .aar from a native Android activity
**On doing this, the Unity ARCore application launches but shows only a black screen. When I pulled the adb log I found the following issue:**
04-01 16:14:32.886 1385 2084 D Unity : Unable to lookup library path for 'arcore_unity_api', native render plugin support disabled.
04-01 16:14:32.887 1385 2084 E Unity : Unable to find arcore_unity_api
04-01 16:14:32.895 1385 2084 D Unity : Unable to lookup library path for 'arpresto_api', native render plugin support disabled.
04-01 16:14:32.895 1385 2084 E Unity : Unable to find arpresto_api
04-01 16:14:32.954 1385 2084 E Unity : DllNotFoundException: arcore_unity_api
I have followed [this tutorial](https://medium.com/@davidbeloosesky/embedded-unity-within-android-app-7061f4f473a) for the implementation.
**Could anyone suggest how I can integrate these missing files into the exported .aar file?**
↧