How to use NAudio in Unity - C#

I'm trying to use some functions of NAudio in my Unity project, but I don't know how to install NAudio in Unity.
What I tried:
· used NuGetForUnity and installed NAudio and NAudio.Core
· downloaded NAudio-Unity.dll and put it in Assets/Plugins/
When I add a script to a GameObject in my scene and write some code like:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using NAudio.Wave;
using NAudio.Wave.SampleProviders;
using Naudio-Unity;

public class NewBehaviourScript : MonoBehaviour
{
    void Start()
    {
        for (int n = -1; n < WaveOut.DeviceCount; n++)
        {
            var caps = WaveOut.GetCapabilities(n);
            Debug.Log($"{n}: {caps.ProductName}");
        }
    }

    void Update()
    {
    }
}
There are still compile errors; Unity cannot recognize WaveOut. Any help on this issue? Thanks in advance.

I recently encountered the same problem. The intuitive approach is to download the DLL from the NuGet Gallery. When I first copied it to the Plugins folder, Unity reported errors because of the Windows Forms API. Downloading the individual DLL files instead made it work.
NAudio dll in Unity Plugins folder
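With the DLL in Assets/Plugins, the line `using Naudio-Unity;` should simply be removed: hyphens are not legal in C# namespace names, and a plugin DLL referenced by the project does not need its own using directive. A minimal sketch of the device-listing script, assuming NAudio.dll resolves correctly from the Plugins folder:

```csharp
using UnityEngine;
using NAudio.Wave;

public class ListAudioDevices : MonoBehaviour
{
    void Start()
    {
        // Device -1 is the Microsoft Sound Mapper; 0..DeviceCount-1 are real output devices.
        for (int n = -1; n < WaveOut.DeviceCount; n++)
        {
            var caps = WaveOut.GetCapabilities(n);
            Debug.Log($"{n}: {caps.ProductName}");
        }
    }
}
```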

Related

Visual Studio IntelliSense for UnityEngine not working

I am trying to write a script for Unity in C#. I created the script and then opened it. When I type Ge, I don't get Unity methods (I only see GetHashCode and GetType); I am looking for GetComponent. I wrote the script (from a tutorial) and it works fine in Unity, so only IntelliSense is not working. It also doesn't work for new Vector3().
I am using Visual Studio Community 2019 (v 16.5.2).
The complete script is:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class AddConstantVelocity : MonoBehaviour
{
    // Start is called before the first frame update
    void Start()
    {
    }

    // Update is called once per frame
    void Update()
    {
        GetComponent<Rigidbody>().velocity = new Vector3(2, 10, -6);
    }
}
You need to set Visual Studio as Unity's default external editor:
Edit > Preferences > External Tools > External Script Editor

"GameObject" not recognized

I just started learning C# for Unity today, and at my third line of code I ran into a problem:
VS Code doesn't "recognize" the GameObject type. The autocomplete menu doesn't show it, and if I type it out by hand it doesn't get coloured.
I'm following a tutorial and I don't want to keep going without solving this.
It's worth clarifying that I didn't install anything other than VSCode and Unity 2019.3.2; maybe I need some extension?
Here's the code:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class NewBehaviourScript : MonoBehaviour
{
    public int health = 5;
    public float fhealth = 5.0f; // note: a plain 5.0 is a double and won't compile as a float
    public GameObject player;
}
Visual Studio Code normally doesn't autocomplete Unity classes if the Visual Studio Code Editor package is not installed through Unity's Package Manager. So if you don't have any compiler errors, you can just continue without autocomplete.

BuildPipeline.BuildAssetBundles deprecated (Unity)

I am developing a scene in Unity that will feature augmented reality. I would like my 3D object to be loaded from an external server. I was following a YouTube tutorial on how to do that, and it uses this script:
using UnityEngine;
using System.Collections;
using UnityEditor;

public class CreateAssetBundles : MonoBehaviour
{
    [MenuItem("Assets/Build AssetBundles")]
    static void BuildAllAssetBundles()
    {
        BuildPipeline.BuildAssetBundles("Assets/AssetBundles");
    }
}
However, BuildPipeline.BuildAssetBundles is said to be deprecated; the current overloads take extra parameters and I have no idea what to pass. This is my first time using BuildPipeline. How do I fix this? I tried asking on Unity Answers, but sadly no one was able to help me, so I'm hoping someone can help me here.
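The path-only overload was removed; the current API expects an options flag and a build target as well. A hedged sketch of the call (the BuildTarget here is an assumption, so match it to your actual platform, and note that the output folder must exist before the call):

```csharp
using UnityEditor;

public class CreateAssetBundles
{
    [MenuItem("Assets/Build AssetBundles")]
    static void BuildAllAssetBundles()
    {
        string outputPath = "Assets/AssetBundles";
        // Unity throws if the output folder does not already exist.
        if (!System.IO.Directory.Exists(outputPath))
            System.IO.Directory.CreateDirectory(outputPath);

        BuildPipeline.BuildAssetBundles(outputPath,
            BuildAssetBundleOptions.None,
            BuildTarget.StandaloneWindows64); // assumption: Windows standalone build
    }
}
```

Editor scripts like this don't need to derive from MonoBehaviour, so the base class is dropped here.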

Using Speechlib from SAPI (Microsoft text to speech API) as an Unity AudioSource

I'm building an app that has a chatbot and uses SAPI for text-to-speech, along with the SALSA asset for lip sync. What I'm trying to accomplish is a live AudioSource that feeds directly from the TTS audio output. I have successfully accomplished this by saving a WAV file for each sentence and then loading the WAV files at runtime into the GameObject that does the lip sync. This works, but the continuous loading of WAV files makes the app slow; it freezes each time it does that and even crashes.
I know it's possible to make a live AudioSource from a microphone on the computer, so I want to do something like that.
I tried what, at my naive level as a programmer, seemed the logical way: simply connect the AudioOutputStream from the TTS to an AudioSource audio clip, like this:
TTSvoice.AudioOutputStream = AudioSource.clip;
and get this error:
error CS0029: Cannot implicitly convert type 'UnityEngine.AudioClip' to 'SpeechLib.ISpeechBaseStream'
I know in Python you can connect audio objects from different libraries through NumPy, by converting the audio to a standard raw array. But I'm also fairly new to C# and Unity.
Here's my code:
using UnityEngine;
using System.Collections;
using SpeechLib;
using System.Xml;
using System.IO;
using System;
using System.Diagnostics;

public class controller : MonoBehaviour
{
    private SpVoice voice;
    private int counter;
    public AudioSource soundvoice;

    // Use this for initialization
    void Start()
    {
        voice = new SpVoice();
        GameObject character = GameObject.Find("character");
        soundvoice = character.GetComponent(typeof(AudioSource)) as AudioSource;
        voice.AudioOutputStream = soundvoice.clip; // this is the line that fails with CS0029
        StartCoroutine(talksome());
    }

    IEnumerator talksome()
    {
        while (true)
        {
            counter++;
            string sentence = "counting " + counter;
            voice.Speak(sentence);
            print(sentence);
            voice.WaitUntilDone(1);
            yield return new WaitForSeconds(2);
        }
    }
}
I'm not that familiar with Unity, but it looks like what you need to do is supply a custom PCMReaderCallback delegate to the AudioClip that adapts the data from the AudioOutputStream (in particular, it needs to normalize the data from 16-bit integers to floats).
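A rough sketch of that idea, assuming the TTS emits 16-bit mono PCM at 22050 Hz (adjust to your SAPI output format). ReadFromTts is a hypothetical placeholder for whatever actually pulls bytes out of the SAPI stream:

```csharp
using UnityEngine;

public class TtsClipFeeder : MonoBehaviour
{
    public AudioSource source;
    const int SampleRate = 22050; // assumption: match the SAPI output format

    void Start()
    {
        // Streaming clip: Unity invokes OnAudioRead whenever it needs more samples.
        var clip = AudioClip.Create("tts", SampleRate * 2, 1, SampleRate, true, OnAudioRead);
        source.clip = clip;
        source.loop = true;
        source.Play();
    }

    void OnAudioRead(float[] data)
    {
        short[] pcm = ReadFromTts(data.Length);
        for (int i = 0; i < data.Length; i++)
            data[i] = i < pcm.Length ? pcm[i] / 32768f : 0f; // normalize int16 to [-1, 1]
    }

    short[] ReadFromTts(int count)
    {
        // Hypothetical: fill this from SpVoice.AudioOutputStream; silence for now.
        return new short[count];
    }
}
```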

SpeechLib.SpVoiceClass:GetVoices crashes my Unity executable

I created a project with Unity (version 4.5.3f3).
I only wrote a simple script, as follows:
using UnityEngine;
using System.Collections;
using SpeechLib;

public class SpeechTest : MonoBehaviour
{
    private SpVoice voice;

    void Start()
    {
        voice = new SpVoice();
        voice.GetVoices("", "");
    }

    // Update is called once per frame
    void Update()
    {
        if (Input.anyKeyDown)
        {
            voice.Speak("Hello, world!", SpeechVoiceSpeakFlags.SVSFlagsAsync);
        }
    }
}
Here you can download the test project for Unity: https://dl.dropboxusercontent.com/u/12184013/TextToSpeech.zip
When I play in the Unity editor, the game runs without problems.
But when I build and run the game, it crashes.
When I comment out this line
ISpeechObjectTokens voices = voice.GetVoices();
the game no longer crashes after I rebuild it.
I need to call the GetVoices method, because I want to set a new voice on the "SpVoice voice" object.
Here is the solution: http://forum.unity3d.com/threads/speechlib-spvoiceclass-getvoices-does-crash-my-unity-executable.268011/#post-1772720
In a nutshell, the library [Unity Install Directory]\Editor\Data\Mono\lib\mono\2.0\CustomMarshalers.dll should be included in the Unity project (adding it as an asset).
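Once CustomMarshalers.dll is in the project, GetVoices can be used to pick a different voice. A minimal sketch, assuming at least one SAPI voice is installed:

```csharp
using UnityEngine;
using SpeechLib;

public class VoicePicker : MonoBehaviour
{
    void Start()
    {
        var voice = new SpVoice();
        // Empty filter strings enumerate every installed SAPI voice.
        ISpeechObjectTokens voices = voice.GetVoices("", "");
        for (int i = 0; i < voices.Count; i++)
            Debug.Log(voices.Item(i).GetDescription(0));

        if (voices.Count > 1)
            voice.Voice = voices.Item(1); // switch to the second voice, if one exists
    }
}
```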
